CINXE.COM
Search results for: camera calibration
<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: camera calibration</title> <meta name="description" content="Search results for: camera calibration"> <meta name="keywords" content="camera calibration"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" alt="Open Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="camera calibration" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="camera calibration"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 428</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: camera calibration</h1> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">428</span> Research of Linear Camera Calibration Based on Planar Pattern</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Jin%20Sun">Jin Sun</a>, <a href="https://publications.waset.org/search?q=Hongbin%20Gu"> Hongbin Gu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> An important step in three-dimensional reconstruction and computer vision is camera calibration, whose objective is to estimate the intrinsic and extrinsic parameters of each camera. In this paper, two linear methods based on the different planes are given. In both methods, the general plane is used to replace the calibration object with very good precision. In the first method, after controlling the camera to undergo five times- translation movements and taking pictures of the orthogonal planes, a set of linear constraints of the camera intrinsic parameters is then derived by means of homography matrix. The second method is to get all camera parameters by taking only one picture of a given radius circle. experiments on simulated data and real images,indicate that our method is reasonable and is a good supplement to camera calibration. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=camera%20calibration" title="camera calibration">camera calibration</a>, <a href="https://publications.waset.org/search?q=3D%20reconstruction" title=" 3D reconstruction"> 3D reconstruction</a>, <a href="https://publications.waset.org/search?q=computervision" title=" computervision"> computervision</a> </p> <a href="https://publications.waset.org/11514/research-of-linear-camera-calibration-based-on-planar-pattern" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/11514/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/11514/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/11514/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/11514/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/11514/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/11514/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/11514/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/11514/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/11514/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/11514/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/11514.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1830</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">427</span> A Method of Planar-Template- Based Camera Self-Calibration for Single-View</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Yue%20Zhao">Yue Zhao</a>, <a href="https://publications.waset.org/search?q=Chao%20Li"> Chao Li</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Camera calibration is an important step in 3D reconstruction. Camera calibration may be classified into two major types: traditional calibration and self-calibration. However, a calibration method in using a checkerboard is intermediate between traditional calibration and self-calibration. A self is proposed based on a square in this paper. Only a square in the planar template, the camera self-calibration can be completed through the single view. The proposed algorithm is that the virtual circle and straight line are established by a square on planar template, and circular points, vanishing points in straight lines and the relation between them are be used, in order to obtain the image of the absolute conic (IAC) and establish the camera intrinsic parameters. To make the calibration template is simpler, as compared with the Zhang Zhengyou-s method. Through real experiments and experiments, the experimental results show that this algorithm is feasible and available, and has a certain precision and robustness. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Absolute%20conic" title="Absolute conic">Absolute conic</a>, <a href="https://publications.waset.org/search?q=camera%20calibration" title=" camera calibration"> camera calibration</a>, <a href="https://publications.waset.org/search?q=circle%20point" title=" circle point"> circle point</a>, <a href="https://publications.waset.org/search?q=vanishing%20point." title=" vanishing point."> vanishing point.</a> </p> <a href="https://publications.waset.org/13898/a-method-of-planar-template-based-camera-self-calibration-for-single-view" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/13898/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/13898/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/13898/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/13898/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/13898/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/13898/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/13898/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/13898/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/13898/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/13898/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/13898.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1895</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">426</span> Calibration Method for an Augmented Reality System</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=S.%20Malek">S. Malek</a>, <a href="https://publications.waset.org/search?q=N.%20Zenati-Henda"> N. Zenati-Henda</a>, <a href="https://publications.waset.org/search?q=M.%20Belhocine"> M. Belhocine</a>, <a href="https://publications.waset.org/search?q=S.%20Benbelkacem"> S. Benbelkacem</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In geometrical camera calibration, the objective is to determine a set of camera parameters that describe the mapping between 3D references coordinates and 2D image coordinates. In this paper, a technique of calibration and tracking based on both a least squares method is presented and a correlation technique developed as part of an augmented reality system. This approach is fast and it can be used for a real time system <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Camera%20calibration" title="Camera calibration">Camera calibration</a>, <a href="https://publications.waset.org/search?q=pinhole%20model" title=" pinhole model"> pinhole model</a>, <a href="https://publications.waset.org/search?q=least%20squares%0Amethod" title=" least squares method"> least squares method</a>, <a href="https://publications.waset.org/search?q=augmented%20reality" title=" augmented reality"> augmented reality</a>, <a href="https://publications.waset.org/search?q=strong%20calibration." title=" strong calibration."> strong calibration.</a> </p> <a href="https://publications.waset.org/15193/calibration-method-for-an-augmented-reality-system" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/15193/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/15193/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/15193/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/15193/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/15193/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/15193/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/15193/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/15193/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/15193/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/15193/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/15193.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">2001</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">425</span> Automatic Camera Calibration for Images of Soccer Match </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Qihe%20Li">Qihe Li</a>, <a href="https://publications.waset.org/search?q=Yupin%20Luo"> Yupin Luo</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>Camera calibration plays an important role in the domain of the analysis of sports video. Considering soccer video, in most cases, the cross-points can be used for calibration at the center of the soccer field are not sufficient, so this paper introduces a new automatic camera calibration algorithm focus on solving this problem by using the properties of images of the center circle, halfway line and a touch line. After the theoretical analysis, a practicable automatic algorithm is proposed. Very little information used though, results of experiments with both synthetic data and real data show that the algorithm is applicable.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Absolute%20conic" title="Absolute conic">Absolute conic</a>, <a href="https://publications.waset.org/search?q=camera%20calibration" title=" camera calibration"> camera calibration</a>, <a href="https://publications.waset.org/search?q=circular%20points" title=" circular points"> circular points</a>, <a href="https://publications.waset.org/search?q=line%20at%20infinity." title=" line at infinity."> line at infinity.</a> </p> <a href="https://publications.waset.org/14049/automatic-camera-calibration-for-images-of-soccer-match" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/14049/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/14049/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/14049/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/14049/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/14049/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/14049/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/14049/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/14049/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/14049/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/14049/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/14049.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">2366</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">424</span> Robust Camera Calibration using Discrete Optimization</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Stephan%20Rupp">Stephan Rupp</a>, <a href="https://publications.waset.org/search?q=Matthias%20Elter"> Matthias Elter</a>, <a href="https://publications.waset.org/search?q=Michael%20Breitung"> Michael Breitung</a>, <a href="https://publications.waset.org/search?q=Walter%20Zink"> Walter Zink</a>, <a href="https://publications.waset.org/search?q=Christian%20K%C3%BCblbeck"> Christian K眉blbeck</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Camera calibration is an indispensable step for augmented reality or image guided applications where quantitative information should be derived from the images. Usually, a camera calibration is obtained by taking images of a special calibration object and extracting the image coordinates of projected calibration marks enabling the calculation of the projection from the 3d world coordinates to the 2d image coordinates. Thus such a procedure exhibits typical steps, including feature point localization in the acquired images, camera model fitting, correction of distortion introduced by the optics and finally an optimization of the model-s parameters. In this paper we propose to extend this list by further step concerning the identification of the optimal subset of images yielding the smallest overall calibration error. For this, we present a Monte Carlo based algorithm along with a deterministic extension that automatically determines the images yielding an optimal calibration. Finally, we present results proving that the calibration can be significantly improved by automated image selection. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Camera%20Calibration" title="Camera Calibration">Camera Calibration</a>, <a href="https://publications.waset.org/search?q=Discrete%20Optimization" title=" Discrete Optimization"> Discrete Optimization</a>, <a href="https://publications.waset.org/search?q=Monte%0ACarlo%20Method." title=" Monte Carlo Method."> Monte Carlo Method.</a> </p> <a href="https://publications.waset.org/450/robust-camera-calibration-using-discrete-optimization" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/450/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/450/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/450/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/450/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/450/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/450/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/450/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/450/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/450/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/450/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/450.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1815</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">423</span> Calibration of Parallel Multi-View Cameras</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=M.%20Ali-Bey">M. Ali-Bey</a>, <a href="https://publications.waset.org/search?q=N.%20Manamanni"> N. Manamanni</a>, <a href="https://publications.waset.org/search?q=S.%20Moughamir"> S. Moughamir</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper focuses on the calibration problem of a multi-view shooting system designed for the production of 3D content for auto-stereoscopic visualization. The considered multiview camera is characterized by coplanar and decentered image sensors regarding to the corresponding optical axis. Based on the Faug茅ras and Toscani-s calibration approach, a calibration method is herein proposed for the case of multi-view camera with parallel and decentered image sensors. At first, the geometrical model of the shooting system is recalled and some industrial prototypes with some shooting simulations are presented. Next, the development of the proposed calibration method is detailed. Finally, some simulation results are presented before ending with some conclusions about this work. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Auto-stereoscopic%20display" title="Auto-stereoscopic display">Auto-stereoscopic display</a>, <a href="https://publications.waset.org/search?q=camera%20calibration" title=" camera calibration"> camera calibration</a>, <a href="https://publications.waset.org/search?q=multi-view%20cameras" title=" multi-view cameras"> multi-view cameras</a>, <a href="https://publications.waset.org/search?q=visual%20servoing" title=" visual servoing"> visual servoing</a> </p> <a href="https://publications.waset.org/1418/calibration-of-parallel-multi-view-cameras" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/1418/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/1418/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/1418/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/1418/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/1418/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/1418/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/1418/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/1418/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/1418/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/1418/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/1418.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1699</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">422</span> X-Corner Detection for Camera Calibration Using Saddle Points</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Abdulrahman%20S.%20Alturki">Abdulrahman S. Alturki</a>, <a href="https://publications.waset.org/search?q=John%20S.%20Loomis"> John S. Loomis</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper discusses a corner detection algorithm for camera calibration. Calibration is a necessary step in many computer vision and image processing applications. Robust corner detection for an image of a checkerboard is required to determine intrinsic and extrinsic parameters. In this paper, an algorithm for fully automatic and robust X-corner detection is presented. Checkerboard corner points are automatically found in each image without user interaction or any prior information regarding the number of rows or columns. The approach represents each X-corner with a quadratic fitting function. Using the fact that the X-corners are saddle points, the coefficients in the fitting function are used to identify each corner location. The automation of this process greatly simplifies calibration. Our method is robust against noise and different camera orientations. Experimental analysis shows the accuracy of our method using actual images acquired at different camera locations and orientations. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Camera%20Calibration" title="Camera Calibration">Camera Calibration</a>, <a href="https://publications.waset.org/search?q=Corner%20Detector" title=" Corner Detector"> Corner Detector</a>, <a href="https://publications.waset.org/search?q=Saddle%0D%0APoints" title=" Saddle Points"> Saddle Points</a>, <a href="https://publications.waset.org/search?q=X-Corners." title=" X-Corners."> X-Corners.</a> </p> <a href="https://publications.waset.org/10003988/x-corner-detection-for-camera-calibration-using-saddle-points" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10003988/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10003988/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10003988/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10003988/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10003988/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10003988/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10003988/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10003988/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10003988/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10003988/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10003988.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">3153</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">421</span> Influence of Temperature Variations on Calibrated Cameras</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Peter%20Podbreznik">Peter Podbreznik</a>, <a href="https://publications.waset.org/search?q=Bo%C5%BEidar%20Potocnik"> Bo啪idar Potocnik</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The camera parameters are changed due to temperature variations, which directly influence calibrated cameras accuracy. Robustness of calibration methods were measured and their accuracy was tested. An error ratio due to camera parameters change with respect to total error originated during calibration process was determined. It pointed out that influence of temperature variations decrease by increasing distance of observed objects from cameras. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=camera%20calibration" title="camera calibration">camera calibration</a>, <a href="https://publications.waset.org/search?q=perspective%20projection%20matrix" title=" perspective projection matrix"> perspective projection matrix</a>, <a href="https://publications.waset.org/search?q=epipolar%20geometry" title=" epipolar geometry"> epipolar geometry</a>, <a href="https://publications.waset.org/search?q=temperature%20variation." title=" temperature variation."> temperature variation.</a> </p> <a href="https://publications.waset.org/7294/influence-of-temperature-variations-on-calibrated-cameras" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/7294/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/7294/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/7294/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/7294/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/7294/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/7294/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/7294/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/7294/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/7294/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/7294/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/7294.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1859</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">420</span> Analytical Camera Model Supplemented with Influence of Temperature Variations</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Peter%20Podbreznik">Peter Podbreznik</a>, <a href="https://publications.waset.org/search?q=Bo%C5%BEidar%20Potocnik"> Bo啪idar Potocnik</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A camera in the building site is exposed to different weather conditions. Differences between images of the same scene captured with the same camera arise also due to temperature variations. The influence of temperature changes on camera parameters were modelled and integrated into existing analytical camera model. Modified camera model enables quantitatively assessing the influence of temperature variations. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=camera%20calibration" title="camera calibration">camera calibration</a>, <a href="https://publications.waset.org/search?q=analytical%20model" title=" analytical model"> analytical model</a>, <a href="https://publications.waset.org/search?q=intrinsic%20parameters" title=" intrinsic parameters"> intrinsic parameters</a>, <a href="https://publications.waset.org/search?q=extrinsic%20parameters" title=" extrinsic parameters"> extrinsic parameters</a>, <a href="https://publications.waset.org/search?q=temperature%20variations." title=" temperature variations."> temperature variations.</a> </p> <a href="https://publications.waset.org/10107/analytical-camera-model-supplemented-with-influence-of-temperature-variations" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10107/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10107/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10107/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10107/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10107/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10107/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10107/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10107/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10107/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10107/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10107.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1507</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">419</span> Dead-Reckoning Error Calibration using Celling Looking Vision Camera</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Jae-Young%20Choi">Jae-Young Choi</a>, <a href="https://publications.waset.org/search?q=Sung-Gaun%20Kim"> Sung-Gaun Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper suggests a calibration method to reduce errors occurring due to mobile robot sliding during location estimation using the Dead-reckoning. Due to sliding of the mobile robot caused between its wheels and the road surface while on free run, location estimation can be erroneous. Sliding especially occurs during cornering of mobile robot. Therefore, in order to reduce these frequent sliding errors in cornering, we calibrated the mobile robot-s heading values using a vision camera and templates of the ceiling. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Dead-reckoning" title="Dead-reckoning">Dead-reckoning</a>, <a href="https://publications.waset.org/search?q=Localization" title=" Localization"> Localization</a>, <a href="https://publications.waset.org/search?q=Odomerty" title=" Odomerty"> Odomerty</a>, <a href="https://publications.waset.org/search?q=Vision%0ACamera" title=" Vision Camera"> Vision Camera</a> </p> <a href="https://publications.waset.org/11899/dead-reckoning-error-calibration-using-celling-looking-vision-camera" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/11899/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/11899/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/11899/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/11899/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/11899/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/11899/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/11899/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/11899/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/11899/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/11899/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/11899.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1783</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">418</span> A Calibration Device for Force-Torque Sensors </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Nicolay%20Zarutskiy">Nicolay Zarutskiy</a>, <a href="https://publications.waset.org/search?q=Roman%20Bulkin"> Roman Bulkin </a> </p> <p class="card-text"><strong>Abstract:</strong></p> The paper deals with the existing methods of force-torque sensor calibration with a number of components from one to six, analyzed their advantages and disadvantages, the necessity of introduction of a calibration method. Calibration method and its constructive realization are also described here. A calibration method allows performing automated force-torque sensor calibration both with selected components of the main vector of forces and moments and with complex loading. Thus, two main advantages of the proposed calibration method are achieved: the automation of the calibration process and universality. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Automation" title="Automation">Automation</a>, <a href="https://publications.waset.org/search?q=calibration" title=" calibration"> calibration</a>, <a href="https://publications.waset.org/search?q=calibration%20device" title=" calibration device"> calibration device</a>, <a href="https://publications.waset.org/search?q=calibration%20method" title=" calibration method"> calibration method</a>, <a href="https://publications.waset.org/search?q=force-torque%20sensors." title=" force-torque sensors."> force-torque sensors.</a> </p> <a href="https://publications.waset.org/10005197/a-calibration-device-for-force-torque-sensors" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10005197/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10005197/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10005197/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10005197/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10005197/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10005197/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10005197/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10005197/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10005197/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10005197/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10005197.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1290</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">417</span> Vehicle Velocity Estimation for Traffic Surveillance System</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=H.%20A.%20Rahim">H. A. Rahim</a>, <a href="https://publications.waset.org/search?q=U.%20U.%20Sheikh"> U. U. Sheikh</a>, <a href="https://publications.waset.org/search?q=R.%20B.%20Ahmad"> R. B. Ahmad</a>, <a href="https://publications.waset.org/search?q=A.%20S.%20M.%20Zain"> A. S. M. Zain</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>This paper describes an algorithm to estimate realtime vehicle velocity using image processing technique from the known camera calibration parameters. The presented algorithm involves several main steps. First, the moving object is extracted by utilizing frame differencing technique. Second, the object tracking method is applied and the speed is estimated based on the displacement of the object-s centroid. Several assumptions are listed to simplify the transformation of 2D images from 3D real-world images. The results obtained from the experiment have been compared to the estimated ground truth. From this experiment, it exhibits that the proposed algorithm has achieved the velocity accuracy estimation of about ± 1.7 km/h.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=camera%20calibration" title="camera calibration">camera calibration</a>, <a href="https://publications.waset.org/search?q=object%20tracking" title=" object tracking"> object tracking</a>, <a href="https://publications.waset.org/search?q=velocity%20estimation" title=" velocity estimation"> velocity estimation</a>, <a href="https://publications.waset.org/search?q=video%20image%20processing" title=" video image processing"> video image processing</a> </p> <a href="https://publications.waset.org/701/vehicle-velocity-estimation-for-traffic-surveillance-system" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/701/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/701/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/701/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/701/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/701/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/701/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/701/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/701/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/701/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/701/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/701.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">4456</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">416</span> Determining the Criteria and their Importance Level of Calibration Supplier Selection</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Ayse%20Gecer">Ayse Gecer</a>, <a href="https://publications.waset.org/search?q=Nihal%20Erginel"> Nihal Erginel</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Quality control is the crucial step for ISO 9001 Quality System Management Standard for companies. While measuring the quality level of both raw material and semi product/product, the calibration of the measuring device is an essential requirement. Calibration suppliers are in the service sector and therefore the calibration supplier selection is becoming a worthy topic for improving service quality. This study presents the results of a questionnaire about the selection criteria of a calibration supplier. The questionnaire was applied to 103 companies and the results are discussed in this paper. The analysis was made with MINITAB 14.0 statistical programs. 鈥淐ompetence of documentations" and 鈥渢echnical capability" are defined as the prerequisites because of the ISO/IEC17025:2005 standard. Also 鈥渨arranties and complaint policy", 鈥渃ommunication", 鈥渟ervice features", 鈥渜uality" and 鈥減erformance history" are defined as very important criteria for calibration supplier selection. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Calibration" title="Calibration">Calibration</a>, <a href="https://publications.waset.org/search?q=criteria%20of%20calibration%20supplier%20selection" title=" criteria of calibration supplier selection"> criteria of calibration supplier selection</a>, <a href="https://publications.waset.org/search?q=calibration%20supplier%20selection" title=" calibration supplier selection"> calibration supplier selection</a>, <a href="https://publications.waset.org/search?q=questionnaire" title=" questionnaire"> questionnaire</a> </p> <a href="https://publications.waset.org/10873/determining-the-criteria-and-their-importance-level-of-calibration-supplier-selection" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10873/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10873/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10873/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10873/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10873/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10873/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10873/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10873/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10873/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10873/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10873.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">2008</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">415</span> Evaluation of Manual and Automatic Calibration Methods for Digital Tachographs</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Sarp%20Erturk">Sarp Erturk</a>, <a href="https://publications.waset.org/search?q=Levent%20Eyigel"> Levent Eyigel</a>, <a href="https://publications.waset.org/search?q=Cihat%20Celik"> Cihat Celik</a>, <a href="https://publications.waset.org/search?q=Muhammet%20Sahinoglu"> Muhammet Sahinoglu</a>, <a href="https://publications.waset.org/search?q=Serdar%20Ay"> Serdar Ay</a>, <a href="https://publications.waset.org/search?q=Yasin%20Kaya"> Yasin Kaya</a>, <a href="https://publications.waset.org/search?q=Hasan%20Kaya"> Hasan Kaya</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>This paper presents a quantitative analysis on the need for automotive calibration methods for digital tachographs. Digital tachographs are mandatory for vehicles used in people and goods transport and they are an important aspect for road safety and inspection. Digital tachographs need to be calibrated for workshops in order for the digital tachograph to display and record speed and odometer values correctly. Calibration of digital tachographs can be performed either manual or automatic. It is shown in this paper that manual calibration of digital tachographs is prone to errors and there can be differences between manual and automatic calibration parameters. Therefore automatic calibration methods are imperative for digital tachograph calibration. The presented experimental results and error analysis clearly support the claims of the paper by evaluating and statistically comparing manual and automatic calibration methods.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Digital%20tachograph" title="Digital tachograph">Digital tachograph</a>, <a href="https://publications.waset.org/search?q=road%20safety" title=" road safety"> road safety</a>, <a href="https://publications.waset.org/search?q=tachograph%20calibration" title=" tachograph calibration"> tachograph calibration</a>, <a href="https://publications.waset.org/search?q=tachograph%20workshops." title=" tachograph workshops. "> tachograph workshops. </a> </p> <a href="https://publications.waset.org/10010564/evaluation-of-manual-and-automatic-calibration-methods-for-digital-tachographs" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10010564/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10010564/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10010564/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10010564/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10010564/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10010564/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10010564/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10010564/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10010564/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10010564/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10010564.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">778</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">414</span> Video Sharing System Based on Wi-Fi Camera</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Qidi%20Lin">Qidi Lin</a>, <a href="https://publications.waset.org/search?q=Hewei%20Yu"> Hewei Yu</a>, <a href="https://publications.waset.org/search?q=Jinbin%20Huang"> Jinbin Huang</a>, <a href="https://publications.waset.org/search?q=Weile%20Liang"> Weile Liang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper introduces a video sharing platform based on WiFi, which consists of camera, mobile phone and PC server. This platform can receive wireless signal from the camera and show the live video on the mobile phone captured by camera. In addition, it is able to send commands to camera and control the camera鈥檚 holder to rotate. The platform can be applied to interactive teaching and dangerous area鈥檚 monitoring and so on. Testing results show that the platform can share the live video of mobile phone. Furthermore, if the system鈥檚 PC server and the camera and many mobile phones are connected together, it can transfer photos concurrently. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Wifi%20Camera" title="Wifi Camera">Wifi Camera</a>, <a href="https://publications.waset.org/search?q=Socket" title=" Socket"> Socket</a>, <a href="https://publications.waset.org/search?q=Mobile%20platform" title=" Mobile platform"> Mobile platform</a>, <a href="https://publications.waset.org/search?q=Video%0D%0Amonitoring" title=" Video monitoring"> Video monitoring</a>, <a href="https://publications.waset.org/search?q=Remote%20control." title=" Remote control."> Remote control.</a> </p> <a href="https://publications.waset.org/10001704/video-sharing-system-based-on-wi-fi-camera" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10001704/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10001704/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10001704/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10001704/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10001704/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10001704/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10001704/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10001704/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10001704/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10001704/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10001704.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1787</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">413</span> Smart Side View Mirror Camera for Real Time System</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Nunziata%20Ivana%20Guarneri">Nunziata Ivana Guarneri</a>, <a href="https://publications.waset.org/search?q=Arcangelo%20Bruna"> Arcangelo Bruna</a>, <a href="https://publications.waset.org/search?q=Giuseppe%20Spampinato"> Giuseppe Spampinato</a>, <a href="https://publications.waset.org/search?q=Antonio%20Buemi"> Antonio Buemi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>In the last decade, automotive companies have invested a lot in terms of innovation about many aspects regarding the automatic driver assistance systems. One innovation regards the usage of a smart camera placed on the car’s side mirror for monitoring the back and lateral road situation. A common road scenario is the overtaking of the preceding car and, in this case, a brief distraction or a loss of concentration can lead the driver to undertake this action, even if there is an already overtaking vehicle, leading to serious accidents. A valid support for a secure drive can be a smart camera system, which is able to automatically analyze the road scenario and consequentially to warn the driver when another vehicle is overtaking. This paper describes a method for monitoring the side view of a vehicle by using camera optical flow motion vectors. The proposed solution detects the presence of incoming vehicles, assesses their distance from the host car, and warns the driver through different levels of alert according to the estimated distance. Due to the low complexity and computational cost, the proposed system ensures real time performances.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Camera%20calibration" title="Camera calibration">Camera calibration</a>, <a href="https://publications.waset.org/search?q=ego%20motion" title=" ego motion"> ego motion</a>, <a href="https://publications.waset.org/search?q=kalman%20filters" title=" kalman filters"> kalman filters</a>, <a href="https://publications.waset.org/search?q=object%20tracking" title=" object tracking"> object tracking</a>, <a href="https://publications.waset.org/search?q=real%20time%20systems." title=" real time systems. "> real time systems. </a> </p> <a href="https://publications.waset.org/10009228/smart-side-view-mirror-camera-for-real-time-system" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10009228/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10009228/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10009228/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10009228/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10009228/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10009228/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10009228/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10009228/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10009228/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10009228/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10009228.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">901</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">412</span> Non-contact Gaze Tracking with Head Movement Adaptation based on Single Camera</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Ying%20Huang">Ying Huang</a>, <a href="https://publications.waset.org/search?q=Zhiliang%20Wang"> Zhiliang Wang</a>, <a href="https://publications.waset.org/search?q=An%20Ping"> An Ping</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>With advances in computer vision, non-contact gaze tracking systems are heading towards being much easier to operate and more comfortable for use, the technique proposed in this paper is specially designed for achieving these goals. For the convenience in operation, the proposal aims at the system with simple configuration which is composed of a fixed wide angle camera and dual infrared illuminators. Then in order to enhance the usability of the system based on single camera, a self-adjusting method which is called Real-time gaze Tracking Algorithm with head movement Compensation (RTAC) is developed for estimating the gaze direction under natural head movement and simplifying the calibration procedure at the same time. According to the actual evaluations, the average accuracy of about 1° is achieved over a field of 20×15×15 cm3.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=computer%20vision" title="computer vision">computer vision</a>, <a href="https://publications.waset.org/search?q=gaze%20tracking" title=" gaze tracking"> gaze tracking</a>, <a href="https://publications.waset.org/search?q=human-computer%20interaction." title=" human-computer interaction."> human-computer interaction.</a> </p> <a href="https://publications.waset.org/10887/non-contact-gaze-tracking-with-head-movement-adaptation-based-on-single-camera" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10887/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10887/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10887/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10887/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10887/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10887/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10887/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10887/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10887/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10887/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10887.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1920</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">411</span> In-Flight Radiometric Performances Analysis of an Airborne Optical Payload</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Caixia%20Gao">Caixia Gao</a>, <a href="https://publications.waset.org/search?q=Chuanrong%20Li"> Chuanrong Li</a>, <a href="https://publications.waset.org/search?q=Lingli%20Tang"> Lingli Tang</a>, <a href="https://publications.waset.org/search?q=Lingling%20Ma"> Lingling Ma</a>, <a href="https://publications.waset.org/search?q=Yaokai%20Liu"> Yaokai Liu</a>, <a href="https://publications.waset.org/search?q=Xinhong%20Wang"> Xinhong Wang</a>, <a href="https://publications.waset.org/search?q=Yongsheng%20Zhou"> Yongsheng Zhou </a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>Performances analysis of remote sensing sensor is required to pursue a range of scientific research and application objectives. Laboratory analysis of any remote sensing instrument is essential, but not sufficient to establish a valid inflight one. In this study, with the aid of the <em>in situ</em> measurements and corresponding image of three-gray scale permanent artificial target, the in-flight radiometric performances analyses (in-flight radiometric calibration, dynamic range and response linearity, signal-noise-ratio (SNR), radiometric resolution) of self-developed short-wave infrared (SWIR) camera are performed. To acquire the inflight calibration coefficients of the SWIR camera, the at-sensor radiances (<em>L<sub>i</sub></em>) for the artificial targets are firstly simulated with <em>in situ </em>measurements (atmosphere parameter and spectral reflectance of the target) and viewing geometries using MODTRAN model. With these radiances and the corresponding digital numbers (<em>DN</em>) in the image, a straight line with a formulation of L = G × DN + B is fitted by a minimization regression method, and the fitted coefficients, G and B, are inflight calibration coefficients. And then the high point (L<sub>H</sub>) and the low point (L<sub>L</sub>) of dynamic range can be described as L<sub>H</sub>= (G × DN<sub>H</sub> + B) and L<sub>L</sub>= B, respectively, where DN<sub>H</sub> is equal to 2<sup>n</sup> − 1 (n is the quantization number of the payload). Meanwhile, the sensor’s response linearity (δ) is described as the correlation coefficient of the regressed line. The results show that the calibration coefficients (G and B) are 0.0083 W·sr<sup>−1</sup>m<sup>−2</sup>µm<sup>−1</sup> and −3.5 W·sr<sup>−1</sup>m<sup>−2</sup>µm<sup>−1</sup>; the low point of dynamic range is −3.5 W·sr<sup>−1</sup>m<sup>−2</sup>µm<sup>−1</sup> and the high point is 30.5 W·sr<sup>−1</sup>m<sup>−2</sup>µm<sup>−1</sup>; the response linearity is approximately 99%. Furthermore, a SNR normalization method is used to assess the sensor’s SNR, and the normalized SNR is about 59.6 when the mean value of radiance is equal to 11.0 W·sr<sup>−1</sup>m<sup>−2</sup>µm<sup>−1</sup>; subsequently, the radiometric resolution is calculated about 0.1845 W•sr<sup>-1</sup>m<sup>-2</sup>μm<sup>-1</sup>. Moreover, in order to validate the result, a comparison of the measured radiance with a radiative-transfer-code-predicted over four portable artificial targets with reflectance of 20%, 30%, 40%, 50% respectively, is performed. It is noted that relative error for the calibration is within 6.6%.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Calibration" title="Calibration">Calibration</a>, <a href="https://publications.waset.org/search?q=dynamic%20range" title=" dynamic range"> dynamic range</a>, <a href="https://publications.waset.org/search?q=radiometric%20resolution" title=" radiometric resolution"> radiometric resolution</a>, <a href="https://publications.waset.org/search?q=SNR." title=" SNR. "> SNR. </a> </p> <a href="https://publications.waset.org/10004246/in-flight-radiometric-performances-analysis-of-an-airborne-optical-payload" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10004246/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10004246/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10004246/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10004246/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10004246/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10004246/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10004246/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10004246/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10004246/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10004246/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10004246.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1340</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">410</span> Ice Load Measurements on Known Structures Using Image Processing Methods</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Azam%20Fazelpour">Azam Fazelpour</a>, <a href="https://publications.waset.org/search?q=Saeed%20R.%20Dehghani"> Saeed R. Dehghani</a>, <a href="https://publications.waset.org/search?q=Vlastimil%20Masek"> Vlastimil Masek</a>, <a href="https://publications.waset.org/search?q=Yuri%20S.%20Muzychka"> Yuri S. Muzychka</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>This study employs a method based on image analyses and structure information to detect accumulated ice on known structures. The icing of marine vessels and offshore structures causes significant reductions in their efficiency and creates unsafe working conditions. Image processing methods are used to measure ice loads automatically. Most image processing methods are developed based on captured image analyses. In this method, ice loads on structures are calculated by defining structure coordinates and processing captured images. A pyramidal structure is designed with nine cylindrical bars as the known structure of experimental setup. Unsymmetrical ice accumulated on the structure in a cold room represents the actual case of experiments. Camera intrinsic and extrinsic parameters are used to define structure coordinates in the image coordinate system according to the camera location and angle. The thresholding method is applied to capture images and detect iced structures in a binary image. The ice thickness of each element is calculated by combining the information from the binary image and the structure coordinate. Averaging ice diameters from different camera views obtains ice thicknesses of structure elements. Comparison between ice load measurements using this method and the actual ice loads shows positive correlations with an acceptable range of error. The method can be applied to complex structures defining structure and camera coordinates.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Camera%20calibration" title="Camera calibration">Camera calibration</a>, <a href="https://publications.waset.org/search?q=Ice%20detection" title=" Ice detection"> Ice detection</a>, <a href="https://publications.waset.org/search?q=ice%20load%20measurements" title=" ice load measurements"> ice load measurements</a>, <a href="https://publications.waset.org/search?q=image%20processing." title=" image processing."> image processing.</a> </p> <a href="https://publications.waset.org/10007700/ice-load-measurements-on-known-structures-using-image-processing-methods" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10007700/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10007700/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10007700/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10007700/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10007700/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10007700/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10007700/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10007700/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10007700/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10007700/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10007700.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1257</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">409</span> A Novel Digital Calibration Technique for Gain and Offset Mismatch in TI危螖 ADCs</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Ali%20Beydoun">Ali Beydoun</a>, <a href="https://publications.waset.org/search?q=Van-Tam%20Nguyen"> Van-Tam Nguyen</a>, <a href="https://publications.waset.org/search?q=Patrick%20Loumeau"> Patrick Loumeau</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Time interleaved sigma-delta (TI危螖) architecture is a potential candidate for high bandwidth analog to digital converters (ADC) which remains a bottleneck for software and cognitive radio receivers. However, the performance of the TI危螖 architecture is limited by the unavoidable gain and offset mismatches resulting from the manufacturing process. This paper presents a novel digital calibration method to compensate the gain and offset mismatch effect. The proposed method takes advantage of the reconstruction digital signal processing on each channel and requires only few logic components for implementation. The run time calibration is estimated to 10 and 15 clock cycles for offset cancellation and gain mismatch calibration respectively. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=sigma-delta" title="sigma-delta">sigma-delta</a>, <a href="https://publications.waset.org/search?q=calibration" title=" calibration"> calibration</a>, <a href="https://publications.waset.org/search?q=gain%20and%20offset%20mismatches" title=" gain and offset mismatches"> gain and offset mismatches</a>, <a href="https://publications.waset.org/search?q=analog-to-digital%20conversion" title="analog-to-digital conversion">analog-to-digital conversion</a>, <a href="https://publications.waset.org/search?q=time-interleaving." title=" time-interleaving."> time-interleaving.</a> </p> <a href="https://publications.waset.org/11985/a-novel-digital-calibration-technique-for-gain-and-offset-mismatch-in-tisd-adcs" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/11985/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/11985/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/11985/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/11985/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/11985/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/11985/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/11985/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/11985/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/11985/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/11985/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/11985.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">5528</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">408</span> Interactive PTZ Camera Control System Using Wii Remote and Infrared Sensor Bar</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=A.%20H.%20W.%20Goh">A. H. W. Goh</a>, <a href="https://publications.waset.org/search?q=Y.%20S.%20Yong"> Y. S. Yong</a>, <a href="https://publications.waset.org/search?q=C.%20H.%20Chan"> C. H. Chan</a>, <a href="https://publications.waset.org/search?q=S.%20J.%20Then"> S. J. Then</a>, <a href="https://publications.waset.org/search?q=L.%20P.%20Chu"> L. P. Chu</a>, <a href="https://publications.waset.org/search?q=S.%20W.%20Chau"> S. W. Chau</a>, <a href="https://publications.waset.org/search?q=H.%20W.%20Hon"> H. W. Hon</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper proposes an alternative control mechanism for an interactive Pan/Tilt/Zoom (PTZ) camera control system. Instead of using a mouse or a joystick, the proposed mechanism utilizes a Nintendo Wii remote and infrared (IR) sensor bar. The Wii remote has buttons that allows the user to control the movement of a PTZ camera through Bluetooth connectivity. In addition, the Wii remote has a built-in motion sensor that allows the user to give control signals to the PTZ camera through pitch and roll movement. A stationary IR sensor bar, placed at some distance away opposite the Wii remote, enables the detection of yaw movement. In addition, the Wii remote-s built-in IR camera has the ability to detect its spatial position, and thus generates a control signal when the user moves the Wii remote. Some experiments are carried out and their performances are compared with an industry-standard PTZ joystick. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Bluetooth" title="Bluetooth">Bluetooth</a>, <a href="https://publications.waset.org/search?q=Infrared" title=" Infrared"> Infrared</a>, <a href="https://publications.waset.org/search?q=Pan%2FTilt%2FZoom" title=" Pan/Tilt/Zoom"> Pan/Tilt/Zoom</a>, <a href="https://publications.waset.org/search?q=PTZ%20Camera" title=" PTZ Camera"> PTZ Camera</a>, <a href="https://publications.waset.org/search?q=Visual%20Surveillance" title="Visual Surveillance">Visual Surveillance</a>, <a href="https://publications.waset.org/search?q=Wii%20Remote" title=" Wii Remote"> Wii Remote</a> </p> <a href="https://publications.waset.org/6162/interactive-ptz-camera-control-system-using-wii-remote-and-infrared-sensor-bar" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/6162/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/6162/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/6162/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/6162/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/6162/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/6162/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/6162/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/6162/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/6162/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/6162/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/6162.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">2099</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">407</span> Study on Construction of 3D Topography by UAV-Based Images</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Yun-Yao%20Chi">Yun-Yao Chi</a>, <a href="https://publications.waset.org/search?q=Chieh-Kai%20Tsai"> Chieh-Kai Tsai</a>, <a href="https://publications.waset.org/search?q=Dai-Ling%20Li"> Dai-Ling Li</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>In this paper, a method of fast 3D topography modeling using the high-resolution camera images is studied based on the characteristics of Unmanned Aerial Vehicle (UAV) system for low altitude aerial photogrammetry and the need of three dimensional (3D) urban landscape modeling. Firstly, the existing high-resolution digital camera with special design of overlap images is designed by reconstructing and analyzing the auto-flying paths of UAVs, which improves the self-calibration function to achieve the high precision imaging by software, and further increased the resolution of the imaging system. Secondly, several-angle images including vertical images and oblique images gotten by the UAV system are used for the detail measure of urban land surfaces and the texture extraction. Finally, the aerial photography and 3D topography construction are both developed in campus of Chang-Jung University and in Guerin district area in Tainan, Taiwan, provide authentication model for construction of 3D topography based on combined UAV-based camera images from system. The results demonstrated that the UAV system for low altitude aerial photogrammetry can be used in the construction of 3D topography production, and the technology solution in this paper offers a new, fast, and technical plan for the 3D expression of the city landscape, fine modeling and visualization.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=3D" title="3D">3D</a>, <a href="https://publications.waset.org/search?q=topography" title=" topography"> topography</a>, <a href="https://publications.waset.org/search?q=UAV" title=" UAV"> UAV</a>, <a href="https://publications.waset.org/search?q=images." title=" images. "> images. </a> </p> <a href="https://publications.waset.org/10008869/study-on-construction-of-3d-topography-by-uav-based-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10008869/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10008869/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10008869/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10008869/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10008869/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10008869/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10008869/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10008869/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10008869/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10008869/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10008869.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">802</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">406</span> Uncertainty Propagation and Sensitivity Analysis During Calibration of an Integrated Land Use and Transport Model</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Parikshit%20Dutta">Parikshit Dutta</a>, <a href="https://publications.waset.org/search?q=Mathieu%20Saujot"> Mathieu Saujot</a>, <a href="https://publications.waset.org/search?q=Elise%20Arnaud"> Elise Arnaud</a>, <a href="https://publications.waset.org/search?q=Benoit%20Lefevre"> Benoit Lefevre</a>, <a href="https://publications.waset.org/search?q=Emmanuel%20Prados"> Emmanuel Prados</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this work, propagation of uncertainty during calibration process of TRANUS, an integrated land use and transport model (ILUTM), has been investigated. It has also been examined, through a sensitivity analysis, which input parameters affect the variation of the outputs the most. Moreover, a probabilistic verification methodology of calibration process, which equates the observed and calculated production, has been proposed. The model chosen as an application is the model of the city of Grenoble, France. For sensitivity analysis and uncertainty propagation, Monte Carlo method was employed, and a statistical hypothesis test was used for verification. The parameters of the induced demand function in TRANUS, were assumed as uncertain in the present case. It was found that, if during calibration, TRANUS converges, then with a high probability the calibration process is verified. Moreover, a weak correlation was found between the inputs and the outputs of the calibration process. The total effect of the inputs on outputs was investigated, and the output variation was found to be dictated by only a few input parameters. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Uncertainty%20propagation" title="Uncertainty propagation">Uncertainty propagation</a>, <a href="https://publications.waset.org/search?q=sensitivity%20analysis" title=" sensitivity analysis"> sensitivity analysis</a>, <a href="https://publications.waset.org/search?q=calibration%0D%0Aunder%20uncertainty" title=" calibration under uncertainty"> calibration under uncertainty</a>, <a href="https://publications.waset.org/search?q=hypothesis%20testing" title=" hypothesis testing"> hypothesis testing</a>, <a href="https://publications.waset.org/search?q=integrated%20land%20use%20and%0D%0Atransport%20models" title=" integrated land use and transport models"> integrated land use and transport models</a>, <a href="https://publications.waset.org/search?q=TRANUS" title=" TRANUS"> TRANUS</a>, <a href="https://publications.waset.org/search?q=Grenoble." title=" Grenoble."> Grenoble.</a> </p> <a href="https://publications.waset.org/12927/uncertainty-propagation-and-sensitivity-analysis-during-calibration-of-an-integrated-land-use-and-transport-model" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/12927/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/12927/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/12927/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/12927/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/12927/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/12927/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/12927/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/12927/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/12927/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/12927/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/12927.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1521</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">405</span> Interaxial Distance and Convergence Control for Efficient Stereoscopic Shooting using Horizontal Moving 3D Camera Rig</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Seong-Mo%20An">Seong-Mo An</a>, <a href="https://publications.waset.org/search?q=Rohit%20Ramesh"> Rohit Ramesh</a>, <a href="https://publications.waset.org/search?q=Young-Sook%20Lee"> Young-Sook Lee</a>, <a href="https://publications.waset.org/search?q=Wan-Young%20Chung"> Wan-Young Chung</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The proper assessment of interaxial distance and convergence control are important factors in stereoscopic imaging technology to make an efficient 3D image. To control interaxial distance and convergence for efficient 3D shooting, horizontal 3D camera rig is designed using some hardware components like 'LM Guide', 'Goniometer' and 'Rotation Stage'. The horizontal 3D camera rig system can be properly aligned by moving the two cameras horizontally in same or opposite directions, by adjusting the camera angle and finally considering horizontal swing as well as vertical swing. In this paper, the relationship between interaxial distance and convergence angle control are discussed and intensive experiments are performed in order to demonstrate an easy and effective 3D shooting. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Interaxial" title="Interaxial">Interaxial</a>, <a href="https://publications.waset.org/search?q=Convergence" title=" Convergence"> Convergence</a>, <a href="https://publications.waset.org/search?q=Stereoscopic" title=" Stereoscopic"> Stereoscopic</a>, <a href="https://publications.waset.org/search?q=Horizontal%203D%0D%0ACamera%20Rig" title=" Horizontal 3D Camera Rig"> Horizontal 3D Camera Rig</a> </p> <a href="https://publications.waset.org/1913/interaxial-distance-and-convergence-control-for-efficient-stereoscopic-shooting-using-horizontal-moving-3d-camera-rig" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/1913/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/1913/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/1913/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/1913/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/1913/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/1913/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/1913/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/1913/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/1913/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/1913/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/1913.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">2646</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">404</span> 2D Bar Codes Reading: Solutions for Camera Phones</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Hao%20Wang">Hao Wang</a>, <a href="https://publications.waset.org/search?q=Yanming%20Zou"> Yanming Zou</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Two-dimensional (2D) bar codes were designed to carry significantly more data with higher information density and robustness than its 1D counterpart. Thanks to the popular combination of cameras and mobile phones, it will naturally bring great commercial value to use the camera phone for 2D bar code reading. This paper addresses the problem of specific 2D bar code design for mobile phones and introduces a low-level encoding method of matrix codes. At the same time, we propose an efficient scheme for 2D bar codes decoding, of which the effort is put on solutions of the difficulties introduced by low image quality that is very common in bar code images taken by a phone camera. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=2D%20bar%20code%20reading" title="2D bar code reading">2D bar code reading</a>, <a href="https://publications.waset.org/search?q=camera%20phone" title=" camera phone"> camera phone</a>, <a href="https://publications.waset.org/search?q=low-level%0Aencoding" title=" low-level encoding"> low-level encoding</a>, <a href="https://publications.waset.org/search?q=mixed%20model" title=" mixed model"> mixed model</a> </p> <a href="https://publications.waset.org/5194/2d-bar-codes-reading-solutions-for-camera-phones" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/5194/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/5194/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/5194/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/5194/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/5194/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/5194/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/5194/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/5194/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/5194/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/5194/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/5194.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1848</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">403</span> Mathematical Programming on Multivariate Calibration Estimation in Stratified Sampling</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Dinesh%20Rao">Dinesh Rao</a>, <a href="https://publications.waset.org/search?q=M.G.M.%20Khan"> M.G.M. Khan</a>, <a href="https://publications.waset.org/search?q=Sabiha%20Khan"> Sabiha Khan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Calibration estimation is a method of adjusting the original design weights to improve the survey estimates by using auxiliary information such as the known population total (or mean) of the auxiliary variables. A calibration estimator uses calibrated weights that are determined to minimize a given distance measure to the original design weights while satisfying a set of constraints related to the auxiliary information. In this paper, we propose a new multivariate calibration estimator for the population mean in the stratified sampling design, which incorporates information available for more than one auxiliary variable. The problem of determining the optimum calibrated weights is formulated as a Mathematical Programming Problem (MPP) that is solved using the Lagrange multiplier technique. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Calibration%20estimation" title="Calibration estimation">Calibration estimation</a>, <a href="https://publications.waset.org/search?q=Stratified%20sampling" title=" Stratified sampling"> Stratified sampling</a>, <a href="https://publications.waset.org/search?q=Multivariate%20auxiliary%20information" title=" Multivariate auxiliary information"> Multivariate auxiliary information</a>, <a href="https://publications.waset.org/search?q=Mathematical%20programming%0Aproblem" title=" Mathematical programming problem"> Mathematical programming problem</a>, <a href="https://publications.waset.org/search?q=Lagrange%20multiplier%20technique." title=" Lagrange multiplier technique."> Lagrange multiplier technique.</a> </p> <a href="https://publications.waset.org/10096/mathematical-programming-on-multivariate-calibration-estimation-in-stratified-sampling" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10096/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10096/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10096/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10096/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10096/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10096/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10096/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10096/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10096/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10096/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10096.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1953</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">402</span> Learning Spatio-Temporal Topology of a Multi-Camera Network by Tracking Multiple People</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Yunyoung%20Nam">Yunyoung Nam</a>, <a href="https://publications.waset.org/search?q=Junghun%20Ryu"> Junghun Ryu</a>, <a href="https://publications.waset.org/search?q=Yoo-Joo%20Choi"> Yoo-Joo Choi</a>, <a href="https://publications.waset.org/search?q=We-Duke%20Cho"> We-Duke Cho</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents a novel approach for representing the spatio-temporal topology of the camera network with overlapping and non-overlapping fields of view (FOVs). The topology is determined by tracking moving objects and establishing object correspondence across multiple cameras. To track people successfully in multiple camera views, we used the Merge-Split (MS) approach for object occlusion in a single camera and the grid-based approach for extracting the accurate object feature. In addition, we considered the appearance of people and the transition time between entry and exit zones for tracking objects across blind regions of multiple cameras with non-overlapping FOVs. The main contribution of this paper is to estimate transition times between various entry and exit zones, and to graphically represent the camera topology as an undirected weighted graph using the transition probabilities. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Surveillance" title="Surveillance">Surveillance</a>, <a href="https://publications.waset.org/search?q=multiple%20camera" title=" multiple camera"> multiple camera</a>, <a href="https://publications.waset.org/search?q=people%20tracking" title=" people tracking"> people tracking</a>, <a href="https://publications.waset.org/search?q=topology." title=" topology."> topology.</a> </p> <a href="https://publications.waset.org/7797/learning-spatio-temporal-topology-of-a-multi-camera-network-by-tracking-multiple-people" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/7797/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/7797/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/7797/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/7797/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/7797/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/7797/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/7797/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/7797/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/7797/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/7797/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/7797.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1651</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">401</span> Development of Perez-Du Mortier Calibration Algorithm for Ground-Based Aerosol Optical Depth Measurement with Validation using SMARTS Model</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Jedol%20Dayou">Jedol Dayou</a>, <a href="https://publications.waset.org/search?q=Jackson%20Hian%20Wui%20Chang"> Jackson Hian Wui Chang</a>, <a href="https://publications.waset.org/search?q=Rubena%20Yusoff"> Rubena Yusoff</a>, <a href="https://publications.waset.org/search?q=Ag.%20Sufiyan%20Abd.%20Hamid"> Ag. Sufiyan Abd. Hamid</a>, <a href="https://publications.waset.org/search?q=Fauziah%20Sulaiman"> Fauziah Sulaiman</a>, <a href="https://publications.waset.org/search?q=Justin%20Sentian"> Justin Sentian</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>Aerosols are small particles suspended in air that have wide varying spatial and temporal distributions. The concentration of aerosol in total columnar atmosphere is normally measured using aerosol optical depth (AOD). In long-term monitoring stations, accurate AOD retrieval is often difficult due to the lack of frequent calibration. To overcome this problem, a near-sea-level Langley calibration algorithm is developed using the combination of clear-sky detection model and statistical filter. It attempts to produce a dataset that consists of only homogenous and stable atmospheric condition for the Langley calibration purposes. In this paper, a radiance-based validation method is performed to further investigate the feasibility and consistency of the proposed algorithm at different location, day, and time. The algorithm is validated using SMARTS model based n DNI value. The overall results confirmed that the proposed calibration algorithm feasible and consistent for measurements taken at different sites and weather conditions.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Aerosol%20optical%20depth" title="Aerosol optical depth">Aerosol optical depth</a>, <a href="https://publications.waset.org/search?q=direct%20normal%20irradiance" title=" direct normal irradiance"> direct normal irradiance</a>, <a href="https://publications.waset.org/search?q=Langley%20calibration" title=" Langley calibration"> Langley calibration</a>, <a href="https://publications.waset.org/search?q=radiance-based%20validation" title=" radiance-based validation"> radiance-based validation</a>, <a href="https://publications.waset.org/search?q=SMARTS." title=" SMARTS. "> SMARTS. </a> </p> <a href="https://publications.waset.org/17241/development-of-perez-du-mortier-calibration-algorithm-for-ground-based-aerosol-optical-depth-measurement-with-validation-using-smarts-model" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/17241/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/17241/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/17241/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/17241/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/17241/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/17241/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/17241/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/17241/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/17241/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/17241/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/17241.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1808</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">400</span> Model-Based Person Tracking Through Networked Cameras</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Kyoung-Mi%20Lee">Kyoung-Mi Lee</a>, <a href="https://publications.waset.org/search?q=Youn-Mi%20Lee"> Youn-Mi Lee</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>This paper proposes a way to track persons by making use of multiple non-overlapping cameras. Tracking persons on multiple non-overlapping cameras enables data communication among cameras through the network connection between a camera and a computer, while at the same time transferring human feature data captured by a camera to another camera that is connected via the network. To track persons with a camera and send the tracking data to another camera, the proposed system uses a hierarchical human model that comprises a head, a torso, and legs. The feature data of the person being modeled are transferred to the server, after which the server sends the feature data of the human model to the cameras connected over the network. This enables a camera that captures a person's movement entering its vision to keep tracking the recognized person with the use of the feature data transferred from the server.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Person%20tracking" title="Person tracking">Person tracking</a>, <a href="https://publications.waset.org/search?q=human%20model" title=" human model"> human model</a>, <a href="https://publications.waset.org/search?q=networked%20cameras" title=" networked cameras"> networked cameras</a>, <a href="https://publications.waset.org/search?q=vision-based%20surveillance." title=" vision-based surveillance."> vision-based surveillance.</a> </p> <a href="https://publications.waset.org/8274/model-based-person-tracking-through-networked-cameras" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/8274/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/8274/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/8274/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/8274/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/8274/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/8274/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/8274/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/8274/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/8274/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/8274/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/8274.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1489</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">399</span> First Person View Camera Based Quadcopter with Raspberry Pi</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=C.%20R.%20Balamurugan">C. R. Balamurugan</a>, <a href="https://publications.waset.org/search?q=P.%20Vijayakumar"> P. Vijayakumar</a>, <a href="https://publications.waset.org/search?q=P.%20Kiruba"> P. Kiruba</a>, <a href="https://publications.waset.org/search?q=S.%20Arun%20Kanna"> S. Arun Kanna</a>, <a href="https://publications.waset.org/search?q=E.%20R.%20Hariprasath"> E. R. Hariprasath</a>, <a href="https://publications.waset.org/search?q=C.%20Anu%20Priya"> C. Anu Priya</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>This paper studies in details about the need of quadcopter in various fields especially in the place of remote area where the road transportation facility is very less. It is used to monitor and collect data in a specific region. The movement of this quadcopter is controlled by the Raspberry Pi. FPV camera is used for capturing the image and will transmit the image to the receiver which can be monitored using an android smart phone. This is mainly used for surveillance purpose and hidden activities can be captured.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=FPV%20camera" title="FPV camera">FPV camera</a>, <a href="https://publications.waset.org/search?q=A2212%20brushless%20direct%20current%20motor" title=" A2212 brushless direct current motor"> A2212 brushless direct current motor</a>, <a href="https://publications.waset.org/search?q=Raspberry%20Pi" title=" Raspberry Pi"> Raspberry Pi</a>, <a href="https://publications.waset.org/search?q=lithium%20polymer%20battery." title=" lithium polymer battery. "> lithium polymer battery. </a> </p> <a href="https://publications.waset.org/10009522/first-person-view-camera-based-quadcopter-with-raspberry-pi" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10009522/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10009522/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10009522/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10009522/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10009522/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10009522/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10009522/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10009522/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10009522/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10009522/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10009522.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1060</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">‹</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=14">14</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=15">15</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=camera%20calibration&page=2" rel="next">›</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">© 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">×</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>