
Search results for: remote sensing images

class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="remote sensing images"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 3913</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: remote sensing images</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3913</span> A Comparative Study on Automatic Feature Classification Methods of Remote Sensing Images </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Lee%20Jeong%20Min">Lee Jeong Min</a>, <a href="https://publications.waset.org/abstracts/search?q=Lee%20Mi%20Hee"> Lee Mi Hee</a>, <a href="https://publications.waset.org/abstracts/search?q=Eo%20Yang%20Dam"> Eo Yang Dam</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Geospatial feature extraction is a very important issue in the remote sensing research. In the meantime, the image classification based on statistical techniques, but, in recent years, data mining and machine learning techniques for automated image processing technology is being applied to remote sensing it has focused on improved results generated possibility. In this study, artificial neural network and decision tree technique is applied to classify the high-resolution satellite images, as compared to the MLC processing result is a statistical technique and an analysis of the pros and cons between each of the techniques. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title="remote sensing">remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=artificial%20neural%20network" title=" artificial neural network"> artificial neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=decision%20tree" title=" decision tree"> decision tree</a>, <a href="https://publications.waset.org/abstracts/search?q=maximum%20likelihood%20classification" title=" maximum likelihood classification"> maximum likelihood classification</a> </p> <a href="https://publications.waset.org/abstracts/48370/a-comparative-study-on-automatic-feature-classification-methods-of-remote-sensing-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/48370.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">347</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3912</span> A Method to Estimate Wheat Yield Using Landsat Data</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Zama%20Mahmood">Zama Mahmood</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The increasing demand of food management, monitoring of the crop growth and forecasting its yield well before harvest is very important. These days, yield assessment together with monitoring of crop development and its growth are being identified with the help of satellite and remote sensing images. Studies using remote sensing data along with field survey validation reported high correlation between vegetation indices and yield. With the development of remote sensing technique, the detection of crop and its mechanism using remote sensing data on regional or global scales have become popular topics in remote sensing applications. Punjab, specially the southern Punjab region is extremely favourable for wheat production. But measuring the exact amount of wheat production is a tedious job for the farmers and workers using traditional ground based measurements. However, remote sensing can provide the most real time information. In this study, using the Normalized Differentiate Vegetation Index (NDVI) indicator developed from Landsat satellite images, the yield of wheat has been estimated during the season of 2013-2014 for the agricultural area around Bahawalpur. The average yield of the wheat was found 35 kg/acre by analysing field survey data. The field survey data is in fair agreement with the NDVI values extracted from Landsat images. A correlation between wheat production (ton) and number of wheat pixels has also been calculated which is in proportional pattern with each other. Also a strong correlation between the NDVI and wheat area was found (R2=0.71) which represents the effectiveness of the remote sensing tools for crop monitoring and production estimation. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=landsat" title="landsat">landsat</a>, <a href="https://publications.waset.org/abstracts/search?q=NDVI" title=" NDVI"> NDVI</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=satellite%20images" title=" satellite images"> satellite images</a>, <a href="https://publications.waset.org/abstracts/search?q=yield" title=" yield"> yield</a> </p> <a href="https://publications.waset.org/abstracts/31728/a-method-to-estimate-wheat-yield-using-landsat-data" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/31728.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">335</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3911</span> Remote Sensing through Deep Neural Networks for Satellite Image Classification</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Teja%20Sai%20Puligadda">Teja Sai Puligadda</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Satellite images in detail can serve an important role in the geographic study. Quantitative and qualitative information provided by the satellite and remote sensing images minimizes the complexity of work and time. Data/images are captured at regular intervals by satellite remote sensing systems, and the amount of data collected is often enormous, and it expands rapidly as technology develops. Interpreting remote sensing images, geographic data mining, and researching distinct vegetation types such as agricultural and forests are all part of satellite image categorization. One of the biggest challenge data scientists faces while classifying satellite images is finding the best suitable classification algorithms based on the available that could able to classify images with utmost accuracy. In order to categorize satellite images, which is difficult due to the sheer volume of data, many academics are turning to deep learning machine algorithms. As, the CNN algorithm gives high accuracy in image recognition problems and automatically detects the important features without any human supervision and the ANN algorithm stores information on the entire network (Abhishek Gupta., 2020), these two deep learning algorithms have been used for satellite image classification. This project focuses on remote sensing through Deep Neural Networks i.e., ANN and CNN with Deep Sat (SAT-4) Airborne dataset for classifying images. Thus, in this project of classifying satellite images, the algorithms ANN and CNN are implemented, evaluated & compared and the performance is analyzed through evaluation metrics such as Accuracy and Loss. Additionally, the Neural Network algorithm which gives the lowest bias and lowest variance in solving multi-class satellite image classification is analyzed. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=artificial%20neural%20network" title="artificial neural network">artificial neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20network" title=" convolutional neural network"> convolutional neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=accuracy" title=" accuracy"> accuracy</a>, <a href="https://publications.waset.org/abstracts/search?q=loss" title=" loss"> loss</a> </p> <a href="https://publications.waset.org/abstracts/146723/remote-sensing-through-deep-neural-networks-for-satellite-image-classification" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/146723.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">159</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3910</span> Color Fusion of Remote Sensing Images for Imparting Fluvial Geomorphological Features of River Yamuna and Ganga over Doon Valley </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=P.%20S.%20Jagadeesh%20Kumar">P. S. Jagadeesh Kumar</a>, <a href="https://publications.waset.org/abstracts/search?q=Tracy%20Lin%20Huan"> Tracy Lin Huan</a>, <a href="https://publications.waset.org/abstracts/search?q=Rebecca%20K.%20Rossi"> Rebecca K. Rossi</a>, <a href="https://publications.waset.org/abstracts/search?q=Yanmin%20Yuan"> Yanmin Yuan</a>, <a href="https://publications.waset.org/abstracts/search?q=Xianpei%20Li"> Xianpei Li</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The fiscal growth of any country hinges on the prudent administration of water resources. The river Yamuna and Ganga are measured as the life line of India as it affords the needs for life to endure. Earth observation over remote sensing images permits the precise description and identification of ingredients on the superficial from space and airborne platforms. Multiple and heterogeneous image sources are accessible for the same geographical section; multispectral, hyperspectral, radar, multitemporal, and multiangular images. In this paper, a taxonomical learning of the fluvial geomorphological features of river Yamuna and Ganga over doon valley using color fusion of multispectral remote sensing images was performed. Experimental results exhibited that the segmentation based colorization technique stranded on pattern recognition, and color mapping fashioned more colorful and truthful colorized images for geomorphological feature extraction. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=color%20fusion" title="color fusion">color fusion</a>, <a href="https://publications.waset.org/abstracts/search?q=geomorphology" title=" geomorphology"> geomorphology</a>, <a href="https://publications.waset.org/abstracts/search?q=fluvial%20processes" title=" fluvial processes"> fluvial processes</a>, <a href="https://publications.waset.org/abstracts/search?q=multispectral%20images" title=" multispectral images"> multispectral images</a>, <a href="https://publications.waset.org/abstracts/search?q=pattern%20recognition" title=" pattern recognition"> pattern recognition</a> </p> <a href="https://publications.waset.org/abstracts/87961/color-fusion-of-remote-sensing-images-for-imparting-fluvial-geomorphological-features-of-river-yamuna-and-ganga-over-doon-valley" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/87961.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">306</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3909</span> The Use of Remote Sensing in the Study of Vegetation Jebel Boutaleb, Setif, Algeria</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Khaled%20Missaoui">Khaled Missaoui</a>, <a href="https://publications.waset.org/abstracts/search?q=Amina%20Beldjazia"> Amina Beldjazia</a>, <a href="https://publications.waset.org/abstracts/search?q=Rachid%20Gharzouli"> Rachid Gharzouli</a>, <a href="https://publications.waset.org/abstracts/search?q=Yamna%20Djellouli"> Yamna Djellouli</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Optical remote sensing makes use of visible, near infrared and short-wave infrared sensors to form images of the earth's surface by detecting the solar radiation reflected from targets on the ground. Different materials reflect and absorb differently at different wavelengths. Thus, the targets can be differentiated by their spectral reflectance signatures in the remotely sensed images. In this work, we are interested to study the distribution of vegetation in the massif forest of Boutaleb (North East of Algeria) which suffered between 1998 and 1999 very large fires. In this case, we use remote sensing with Landsat images from two dates (1984 and 2000) to see the results of these fires. Vegetation has a unique spectral signature which enables it to be distinguished readily from other types of land cover in an optical/near-infrared image. Normalized Difference Vegetation Index (NDVI) is calculated with ENVI 4.7 from Band 3 and 4. The results showed a very important floristic diversity in this forest. The comparison of NDVI from the two dates confirms that there is a decrease of the density of vegetation in this area due to repeated fires. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title="remote sensing">remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=boutaleb" title=" boutaleb"> boutaleb</a>, <a href="https://publications.waset.org/abstracts/search?q=diversity" title=" diversity"> diversity</a>, <a href="https://publications.waset.org/abstracts/search?q=forest" title=" forest"> forest</a> </p> <a href="https://publications.waset.org/abstracts/23426/the-use-of-remote-sensing-in-the-study-of-vegetation-jebel-boutaleb-setif-algeria" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/23426.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">560</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3908</span> Remote Sensing and GIS for Land Use Change Assessment: Case Study of Oued Bou Hamed Watershed, Southern Tunisia</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ouerchefani%20Dalel">Ouerchefani Dalel</a>, <a href="https://publications.waset.org/abstracts/search?q=Mahdhaoui%20Basma"> Mahdhaoui Basma</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Land use change is one of the important factors needed to evaluate later on the impact of human actions on land degradation. This work present the application of a methodology based on remote sensing for evaluation land use change in an arid region of Tunisia. This methodology uses Landsat TM and ETM+ images to produce land use maps by supervised classification based on ground truth region of interests. This study showed that it was possible to rely on radiometric values of the pixels to define each land use class in the field. It was also possible to generate 3 land use classes of the same study area between 1988 and 2011. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=land%20use" title="land use">land use</a>, <a href="https://publications.waset.org/abstracts/search?q=change" title=" change"> change</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=GIS" title=" GIS"> GIS</a> </p> <a href="https://publications.waset.org/abstracts/31556/remote-sensing-and-gis-for-land-use-change-assessment-case-study-of-oued-bou-hamed-watershed-southern-tunisia" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/31556.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">565</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3907</span> Automatic Extraction of Water Bodies Using Whole-R Method</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Nikhat%20Nawaz">Nikhat Nawaz</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20Srinivasulu"> S. 
[3907] Automatic Extraction of Water Bodies Using Whole-R Method
Authors: Nikhat Nawaz, S. Srinivasulu, P. Kesava Rao
Abstract: Feature extraction plays an important role in many remote sensing applications. Automatic extraction of water bodies is of great significance in applications such as change detection and image retrieval. This paper presents a procedure for the automatic extraction of water information from remote sensing images. The algorithm uses the relative location of the R-colour component in the chromaticity diagram and is integrated with the spatial scale transformation of the whole method, which is based on a water index fitted from a spectral library. Experimental results demonstrate the improved accuracy and effectiveness of the integrated method for the automatic extraction of water bodies.
Keywords: feature extraction, remote sensing, image retrieval, chromaticity, water index, spectral library, integrated method
Procedia: https://publications.waset.org/abstracts/2097/automatic-extraction-of-water-bodies-using-whole-r-method | PDF: https://publications.waset.org/abstracts/2097.pdf | Downloads: 385

[3906] Oil-Spill Monitoring in Istanbul Strait and Marmara Sea by RASAT Remote Sensing Images
Authors: Ozgun Oktar, Sevilay Can, Cengiz V. Ekici
Abstract: An oil spill is a form of pollution caused by the release of liquid petroleum hydrocarbons into the marine environment. Growing ship traffic, increasing offshore oil drilling and seaside refineries all push the risk of oil spills upward. Oil spreads easily over large areas when a spill occurs, especially on the sea surface. Remote sensing technology offers the easiest way to monitor the area of an oil spill over a large region. Pollution caused by ship accidents is usually easy to detect; monitoring non-accidental pollution is also possible with remote sensing.
Specific regions also need to be observed daily and continuously with satellite solutions. The remote sensing satellites most commonly and effectively used for monitoring oil pollution are RADARSAT, ENVISAT and MODIS; however, the spectral coverage and revisit periods of these satellites are not suitable for monitoring the Marmara Sea and Istanbul Strait continuously. In this study, RASAT and GOKTURK-2 are suggested for monitoring the Marmara Sea and Istanbul Strait. RASAT, with a spectral range of 420-730 nm, is the first Turkish-built satellite, and GOKTURK-2's resolution can reach up to 2.5 meters. This study aims to analyse the images from both satellites and produce maps showing the regions potentially affected by spills from shipping traffic.
Keywords: Marmara Sea, monitoring, oil spill, satellite remote sensing
Procedia: https://publications.waset.org/abstracts/52390/oil-spill-monitoring-in-istanbul-strait-and-marmara-sea-by-rasat-remote-sensing-images | PDF: https://publications.waset.org/abstracts/52390.pdf | Downloads: 423

[3905] Application of the Hit or Miss Transform to Detect Dams Monitored for Water Quality Using Remote Sensing in South Africa
Authors: Brighton Chamunorwa
Abstract: Current remote sensing of water quality procedures does not provide a step representing physical visualisation of the monitored dam. The application of remote sensing of water quality techniques may benefit from the use of mathematical morphology operators for shape identification. Given a dam outline as input, morphological operators such as the hit-or-miss transform identify whether the water body is present in input remotely sensed images. This study seeks to determine the accuracy of the hit-or-miss transform in identifying dams monitored by the water resources authorities in South Africa on satellite images. To achieve this objective, the study downloaded a Landsat image acquired in winter and tested the capability of the hit-or-miss transform using shapefile boundaries of dams in the Crocodile Marico catchment. The results of the experiment show that it is possible to detect most dams on the Landsat image after adjusting the erosion operator to detect pixels matching a percentage similarity of 80% and above. Successful implementation of the current study contributes towards the optimisation of mathematical morphology image operators and helps develop remote sensing of water quality monitoring with improved simulation of the conventional procedures.
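The sketch below shows the morphological operations named in this abstract on a toy binary water mask. The scene, footprint and detection rule are assumptions for illustration; the study's actual workflow used Landsat imagery and shapefile-derived dam outlines.

```python
# Illustrative sketch (toy binary image, not the study's Landsat workflow): using
# binary erosion and the hit-or-miss transform from scipy.ndimage to check whether
# a water-body shape is present in a classified water mask.
import numpy as np
from scipy import ndimage

# Toy water mask: a 6x4 "dam" embedded in a 20x20 scene.
water_mask = np.zeros((20, 20), dtype=bool)
water_mask[5:11, 8:12] = True

# Structuring element representing the expected dam footprint (rasterised from a
# shapefile outline in the real workflow).
footprint = np.ones((6, 4), dtype=bool)

# Erosion: pixels where the full footprint fits inside the water mask.
hits = ndimage.binary_erosion(water_mask, structure=footprint)
print("Dam detected by erosion:", bool(hits.any()))

# Hit-or-miss transform: matches the foreground footprint and a background ring.
padded = np.pad(footprint, 1)           # footprint surrounded by background
hom = ndimage.binary_hit_or_miss(water_mask, structure1=padded, structure2=~padded)
print("Dam detected by hit-or-miss:", bool(hom.any()))
```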
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=hit%20or%20miss%20transform" title="hit or miss transform">hit or miss transform</a>, <a href="https://publications.waset.org/abstracts/search?q=mathematical%20morphology" title=" mathematical morphology"> mathematical morphology</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=water%20quality%20monitoring" title=" water quality monitoring"> water quality monitoring</a> </p> <a href="https://publications.waset.org/abstracts/128717/application-of-the-hit-or-miss-transform-to-detect-dams-monitored-for-water-quality-using-remote-sensing-in-south-africa" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/128717.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">153</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3904</span> Unsupervised Detection of Burned Area from Remote Sensing Images Using Spatial Correlation and Fuzzy Clustering </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Tauqir%20A.%20Moughal">Tauqir A. Moughal</a>, <a href="https://publications.waset.org/abstracts/search?q=Fusheng%20Yu"> Fusheng Yu</a>, <a href="https://publications.waset.org/abstracts/search?q=Abeer%20Mazher"> Abeer Mazher</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Land-cover and land-use change information are important because of their practical uses in various applications, including deforestation, damage assessment, disasters monitoring, urban expansion, planning, and land management. Therefore, developing change detection methods for remote sensing images is an important ongoing research agenda. However, detection of change through optical remote sensing images is not a trivial task due to many factors including the vagueness between the boundaries of changed and unchanged regions and spatial dependence of the pixels to its neighborhood. In this paper, we propose a binary change detection technique for bi-temporal optical remote sensing images. As in most of the optical remote sensing images, the transition between the two clusters (change and no change) is overlapping and the existing methods are incapable of providing the accurate cluster boundaries. In this regard, a methodology has been proposed which uses the fuzzy c-means clustering to tackle the problem of vagueness in the changed and unchanged class by formulating the soft boundaries between them. Furthermore, in order to exploit the neighborhood information of the pixels, the input patterns are generated corresponding to each pixel from bi-temporal images using 3×3, 5×5 and 7×7 window. The between images and within image spatial dependence of the pixels to its neighborhood is quantified by using Pearson product moment correlation and Moran’s I statistics, respectively. The proposed technique consists of two phases. At first, between images and within image spatial correlation is calculated to utilize the information that the pixels at different locations may not be independent. 
Second, the fuzzy c-means technique is used to produce two clusters from the input features, not only accounting for the vagueness between the changed and unchanged classes but also exploiting the spatial correlation of the pixels. To show the effectiveness of the proposed technique, experiments are conducted on multispectral and bi-temporal remote sensing images. A subset (2100×1212 pixels) of a pan-sharpened, bi-temporal Landsat 5 Thematic Mapper optical image of Los Angeles, California, is used in this study; it covers a long forest fire that continued from July until October 2009. The early and later forest fire optical remote sensing images were acquired on July 5, 2009 and October 25, 2009, respectively. The proposed technique is used to detect the fire (which causes change on the Earth's surface) and is compared with the existing K-means clustering technique. Experimental results show that the proposed technique performs better than the existing one. The proposed technique is easily extendable to optical hyperspectral images and is suitable for many practical applications.
Keywords: burned area, change detection, correlation, fuzzy clustering, optical remote sensing
Procedia: https://publications.waset.org/abstracts/82253/unsupervised-detection-of-burned-area-from-remote-sensing-images-using-spatial-correlation-and-fuzzy-clustering | PDF: https://publications.waset.org/abstracts/82253.pdf | Downloads: 169

[3903] Flood Monitoring Using Active Microwave Remote Sensed Synthetic Aperture Radar Data
Authors: Bikramjit Goswami, Manoranjan Kalita
Abstract: Active microwave remote sensing is useful for remote sensing applications in cloud-covered regions of the world. Because of its high spatial resolution, the spatial variations of land cover can be monitored in greater detail using synthetic aperture radar (SAR). In the present experimental study, inundation is studied using SAR images obtained from Sentinel-1A in both VH and VV polarizations. The temporal variation of the SAR scattering coefficient values for the area gives a good indication of the flood and its boundary. The study area is the district of Morigaon in the state of Assam in India. The flood monitoring period is the monsoon season of 2017, during which a major flood occurred in the state of Assam.
The variation in microwave scattering values distinguishes the flooded period from the non-flooded period. Frequent monitoring of floods over a large area (10 km x 10 km) using passive microwave sensing, combined with pinpointing the actual flooded portions (5 m x 5 m) within the flooded area using active microwave sensing, can be a highly useful combination, as revealed by the present experimental results.
Keywords: active remote sensing, flood monitoring, microwave remote sensing, synthetic aperture radar
Procedia: https://publications.waset.org/abstracts/105375/flood-monitoring-using-active-microwave-remote-sensed-synthetic-aperture-radar-data | PDF: https://publications.waset.org/abstracts/105375.pdf | Downloads: 151

[3902] Efficient Schemes of Classifiers for Remote Sensing Satellite Imageries of Land Use Pattern Classifications
Authors: S. S. Patil, Sachidanand Kini
Abstract: Classification of land use patterns is challenging because of the complexity and variability of remote sensing image data. This research in remote sensing applications mines significant spatially variable factors, such as land cover and land use, from satellite images of remote arid areas in Karnataka State, India. Diverse classification techniques, unsupervised and supervised, consisting of maximum likelihood, Mahalanobis distance, and minimum distance, are applied in Bellary District, Karnataka State, India, for the classification of the raw satellite images. The accuracy of the results is evaluated by visual comparison with standard maps and ground truth. The maximum likelihood technique gave the finest results, while both the minimum distance and Mahalanobis distance methods overvalued agricultural land areas. Although a few irrelevant features were missed because of the low resolution of the satellite images, good agreement was found between the parameters extracted automatically from the derived maps and the field observations.
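Two of the classifiers compared in this abstract, minimum distance and Mahalanobis distance, can be sketched directly in NumPy as below. The training spectra are synthetic stand-ins for ROI pixels; the real study worked on raw satellite images of Bellary District.

```python
# Illustrative sketch (synthetic spectra): minimum-distance and Mahalanobis-distance
# classifiers of the kind compared in the study.
import numpy as np

def fit_class_stats(X, y):
    """Per-class mean vector and covariance matrix from training spectra."""
    stats = {}
    for c in np.unique(y):
        Xc = X[y == c]
        stats[c] = (Xc.mean(axis=0), np.cov(Xc, rowvar=False))
    return stats

def minimum_distance(X, stats):
    means = np.stack([m for m, _ in stats.values()])
    d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
    return np.array(list(stats))[d.argmin(axis=1)]

def mahalanobis_distance(X, stats):
    dists = []
    for mean, cov in stats.values():
        inv = np.linalg.inv(cov)
        diff = X - mean
        dists.append(np.einsum("ij,jk,ik->i", diff, inv, diff))
    return np.array(list(stats))[np.argmin(np.stack(dists, axis=1), axis=1)]

# Synthetic two-class, four-band training data standing in for ROI spectra.
rng = np.random.default_rng(4)
X_train = np.vstack([rng.normal(0.2, 0.05, (200, 4)), rng.normal(0.5, 0.08, (200, 4))])
y_train = np.repeat([0, 1], 200)
stats = fit_class_stats(X_train, y_train)

X_test = rng.normal(0.35, 0.15, (10, 4))
print("minimum distance :", minimum_distance(X_test, stats))
print("Mahalanobis      :", mahalanobis_distance(X_test, stats))
```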
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mahalanobis%20distance" title="Mahalanobis distance">Mahalanobis distance</a>, <a href="https://publications.waset.org/abstracts/search?q=minimum%20distance" title=" minimum distance"> minimum distance</a>, <a href="https://publications.waset.org/abstracts/search?q=supervised" title=" supervised"> supervised</a>, <a href="https://publications.waset.org/abstracts/search?q=unsupervised" title=" unsupervised"> unsupervised</a>, <a href="https://publications.waset.org/abstracts/search?q=user%20classification%20accuracy" title=" user classification accuracy"> user classification accuracy</a>, <a href="https://publications.waset.org/abstracts/search?q=producer%27s%20classification%20accuracy" title=" producer&#039;s classification accuracy"> producer&#039;s classification accuracy</a>, <a href="https://publications.waset.org/abstracts/search?q=maximum%20likelihood" title=" maximum likelihood"> maximum likelihood</a>, <a href="https://publications.waset.org/abstracts/search?q=kappa%20coefficient" title=" kappa coefficient"> kappa coefficient</a> </p> <a href="https://publications.waset.org/abstracts/103621/efficient-schemes-of-classifiers-for-remote-sensing-satellite-imageries-of-land-use-pattern-classifications" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/103621.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">183</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3901</span> Oil Pollution Analysis of the Ecuadorian Rainforest Using Remote Sensing Methods</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Juan%20Heredia">Juan Heredia</a>, <a href="https://publications.waset.org/abstracts/search?q=Naci%20Dilekli"> Naci Dilekli</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The Ecuadorian Rainforest has been polluted for almost 60 years with little to no regard to oversight, law, or regulations. The consequences have been vast environmental damage such as pollution and deforestation, as well as sickness and the death of many people and animals. The aim of this paper is to quantify and localize the polluted zones, which something that has not been conducted and is the first step for remediation. To approach this problem, multi-spectral Remote Sensing imagery was utilized using a novel algorithm developed for this study, based on four normalized indices available in the literature. The algorithm classifies the pixels in polluted or healthy ones. The results of this study include a new algorithm for pixel classification and quantification of the polluted area in the selected image. Those results were finally validated by ground control points found in the literature. The main conclusion of this work is that using hyperspectral images, it is possible to identify polluted vegetation. The future work is environmental remediation, in-situ tests, and more extensive results that would inform new policymaking. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title="remote sensing">remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=oil%20pollution%20quatification" title=" oil pollution quatification"> oil pollution quatification</a>, <a href="https://publications.waset.org/abstracts/search?q=amazon%20forest" title=" amazon forest"> amazon forest</a>, <a href="https://publications.waset.org/abstracts/search?q=hyperspectral%20remote%20sensing" title=" hyperspectral remote sensing"> hyperspectral remote sensing</a> </p> <a href="https://publications.waset.org/abstracts/127618/oil-pollution-analysis-of-the-ecuadorian-rainforest-using-remote-sensing-methods" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/127618.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">163</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3900</span> 3D Remote Sensing Images Parallax Refining Based On HTML5</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Qian%20Pei">Qian Pei</a>, <a href="https://publications.waset.org/abstracts/search?q=Hengjian%20Tong"> Hengjian Tong</a>, <a href="https://publications.waset.org/abstracts/search?q=Weitao%20Chen"> Weitao Chen</a>, <a href="https://publications.waset.org/abstracts/search?q=Hai%20Wang"> Hai Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Yanrong%20Feng"> Yanrong Feng</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Horizontal parallax is the foundation of stereoscopic viewing. However, the human eye will feel uncomfortable and it will occur diplopia if horizontal parallax is larger than eye separation. Therefore, we need to do parallax refining before conducting stereoscopic observation. Although some scholars have been devoted to online remote sensing refining, the main work of image refining is completed on the server side. There will be a significant delay when multiple users access the server at the same time. The emergence of HTML5 technology in recent years makes it possible to develop rich browser web application. Authors complete the image parallax refining on the browser side based on HTML5, while server side only need to transfer image data and parallax file to browser side according to the browser’s request. In this way, we can greatly reduce the server CPU load and allow a large number of users to access server in parallel and respond the user’s request quickly. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=3D%20remote%20sensing%20images" title="3D remote sensing images">3D remote sensing images</a>, <a href="https://publications.waset.org/abstracts/search?q=parallax" title=" parallax"> parallax</a>, <a href="https://publications.waset.org/abstracts/search?q=online%20refining" title=" online refining"> online refining</a>, <a href="https://publications.waset.org/abstracts/search?q=rich%20browser%20web%20application" title=" rich browser web application"> rich browser web application</a>, <a href="https://publications.waset.org/abstracts/search?q=HTML5" title=" HTML5"> HTML5</a> </p> <a href="https://publications.waset.org/abstracts/19927/3d-remote-sensing-images-parallax-refining-based-on-html5" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/19927.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">461</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3899</span> Geographic Information Systems and Remotely Sensed Data for the Hydrological Modelling of Mazowe Dam</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ellen%20Nhedzi%20Gozo">Ellen Nhedzi Gozo</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Unavailability of adequate hydro-meteorological data has always limited the analysis and understanding of hydrological behaviour of several dam catchments including Mazowe Dam in Zimbabwe. The problem of insufficient data for Mazowe Dam catchment analysis was solved by extracting catchment characteristics and aerial hydro-meteorological data from ASTER, LANDSAT, Shuttle Radar Topographic Mission SRTM remote sensing (RS) images using ILWIS, ArcGIS and ERDAS Imagine geographic information systems (GIS) software. Available observed hydrological as well as meteorological data complemented the use of the remotely sensed information. Ground truth land cover was mapped using a Garmin Etrex global positioning system (GPS) system. This information was then used to validate land cover classification detail that was obtained from remote sensing images. A bathymetry survey was conducted using a SONAR system connected to GPS. Hydrological modelling using the HBV model was then performed to simulate the hydrological process of the catchment in an effort to verify the reliability of the derived parameters. The model output shows a high Nash-Sutcliffe Coefficient that is close to 1 indicating that the parameters derived from remote sensing and GIS can be applied with confidence in the analysis of Mazowe Dam catchment. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=geographic%20information%20systems" title="geographic information systems">geographic information systems</a>, <a href="https://publications.waset.org/abstracts/search?q=hydrological%20modelling" title=" hydrological modelling"> hydrological modelling</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=water%20resources%20management" title=" water resources management"> water resources management</a> </p> <a href="https://publications.waset.org/abstracts/46387/geographic-information-systems-and-remotely-sensed-data-for-the-hydrological-modelling-of-mazowe-dam" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/46387.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">336</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3898</span> Elevating Environmental Impact Assessment through Remote Sensing in Engineering</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Spoorthi%20Srupad">Spoorthi Srupad</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Environmental Impact Assessment (EIA) stands as a critical engineering application facilitated by Earth Resources and Environmental Remote Sensing. Employing advanced technologies, this process enables a systematic evaluation of potential environmental impacts arising from engineering projects. Remote sensing techniques, including satellite imagery and geographic information systems (GIS), play a pivotal role in providing comprehensive data for assessing changes in land cover, vegetation, water bodies, and air quality. This abstract delves into the significance of EIA in engineering, emphasizing its role in ensuring sustainable and environmentally responsible practices. The integration of remote sensing technologies enhances the accuracy and efficiency of impact assessments, contributing to informed decision-making and the mitigation of adverse environmental consequences associated with engineering endeavors. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=environmental%20impact%20assessment" title="environmental impact assessment">environmental impact assessment</a>, <a href="https://publications.waset.org/abstracts/search?q=engineering%20applications" title=" engineering applications"> engineering applications</a>, <a href="https://publications.waset.org/abstracts/search?q=sustainability" title=" sustainability"> sustainability</a>, <a href="https://publications.waset.org/abstracts/search?q=environmental%20monitoring" title=" environmental monitoring"> environmental monitoring</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=geographic%20information%20systems" title=" geographic information systems"> geographic information systems</a>, <a href="https://publications.waset.org/abstracts/search?q=environmental%20management" title=" environmental management"> environmental management</a> </p> <a href="https://publications.waset.org/abstracts/179151/elevating-environmental-impact-assessment-through-remote-sensing-in-engineering" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/179151.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">92</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3897</span> Integration of GIS with Remote Sensing and GPS for Disaster Mitigation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sikander%20Nawaz%20Khan">Sikander Nawaz Khan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Natural disasters like flood, earthquake, cyclone, volcanic eruption and others are causing immense losses to the property and lives every year. Current status and actual loss information of natural hazards can be determined and also prediction for next probable disasters can be made using different remote sensing and mapping technologies. Global Positioning System (GPS) calculates the exact position of damage. It can also communicate with wireless sensor nodes embedded in potentially dangerous places. GPS provide precise and accurate locations and other related information like speed, track, direction and distance of target object to emergency responders. Remote Sensing facilitates to map damages without having physical contact with target area. Now with the addition of more remote sensing satellites and other advancements, early warning system is used very efficiently. Remote sensing is being used both at local and global scale. High Resolution Satellite Imagery (HRSI), airborne remote sensing and space-borne remote sensing is playing vital role in disaster management. Early on Geographic Information System (GIS) was used to collect, arrange, and map the spatial information but now it has capability to analyze spatial data. This analytical ability of GIS is the main cause of its adaption by different emergency services providers like police and ambulance service. Full potential of these so called 3S technologies cannot be used in alone. 
The integration of GPS and other remote sensing techniques with GIS has opened new horizons in the modelling of earth science activities. Several remote sensing cases, including the Indian Ocean tsunami in 2004, the Mount Mangart landslides and the Pakistan-India earthquake in 2005, are described in this paper.
Keywords: disaster mitigation, GIS, GPS, remote sensing
Procedia: https://publications.waset.org/abstracts/11085/integration-of-gis-with-remote-sensing-and-gps-for-disaster-mitigation | PDF: https://publications.waset.org/abstracts/11085.pdf | Downloads: 481

[3896] Analysis of Spatial and Temporal Data Using Remote Sensing Technology
Authors: Kapil Pandey, Vishnu Goyal
Abstract: Spatial and temporal data analysis is well known in the field of satellite image processing. When spatial data are combined with time-series analysis, they give significant results in change detection studies. In this paper, GIS and remote sensing techniques have been used for change detection using time-series satellite imagery of Uttarakhand state during the years 1990-2010. Natural vegetation, urban areas, forest cover, etc. were chosen as the main land use classes to study. Land use/land cover classifications for several years were prepared using satellite images. The maximum likelihood supervised classification technique was adopted in this work; finally, a land use change index was generated, and graphical models were used to present the changes.
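A land use change index of the kind mentioned above can be derived from a transition matrix between two classified dates, as sketched below with toy classified maps; the class list and the specific index definition are assumptions for illustration.

```python
# Illustrative sketch (toy classified maps): a land use transition matrix between
# two dates and a simple change index derived from it.
import numpy as np

classes = ["vegetation", "urban", "forest"]
rng = np.random.default_rng(6)
map_1990 = rng.integers(0, 3, size=(100, 100))   # placeholder classified map
map_2010 = rng.integers(0, 3, size=(100, 100))   # placeholder classified map

# Transition matrix: rows are 1990 classes, columns are 2010 classes.
n = len(classes)
transition = np.zeros((n, n), dtype=int)
np.add.at(transition, (map_1990.ravel(), map_2010.ravel()), 1)
print(transition)

# One possible change index: fraction of pixels whose class changed between dates.
changed = (map_1990 != map_2010).mean()
print(f"Change index (fraction of changed pixels): {changed:.2f}")
```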
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=GIS" title="GIS">GIS</a>, <a href="https://publications.waset.org/abstracts/search?q=landuse%2Flandcover" title=" landuse/landcover"> landuse/landcover</a>, <a href="https://publications.waset.org/abstracts/search?q=spatial%20and%20temporal%20data" title=" spatial and temporal data"> spatial and temporal data</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a> </p> <a href="https://publications.waset.org/abstracts/40918/analysis-of-spatial-and-temporal-data-using-remote-sensing-technology" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/40918.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">433</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3895</span> Advancing Horizons: Standardized Future Trends in LiDAR and Remote Sensing Technologies</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Spoorthi%20Sripad">Spoorthi Sripad</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Rapid advancements in LiDAR (Light Detection and Ranging) technology, coupled with the synergy of remote sensing, have revolutionized Earth observation methodologies. This paper delves into the transformative impact of integrated LiDAR and remote sensing systems. Focusing on miniaturization, cost reduction, and improved resolution, the study explores the evolving landscape of terrestrial and aquatic environmental monitoring. The integration of multi-wavelength and dual-mode LiDAR systems, alongside collaborative efforts with other remote sensing technologies, presents a comprehensive approach. The paper highlights the pivotal role of LiDAR in environmental assessment, urban planning, and infrastructure development. As the amalgamation of LiDAR and remote sensing reshapes Earth observation, this research anticipates a paradigm shift in our understanding of dynamic planetary processes. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=LiDAR" title="LiDAR">LiDAR</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=earth%20observation" title=" earth observation"> earth observation</a>, <a href="https://publications.waset.org/abstracts/search?q=advancements" title=" advancements"> advancements</a>, <a href="https://publications.waset.org/abstracts/search?q=integration" title=" integration"> integration</a>, <a href="https://publications.waset.org/abstracts/search?q=environmental%20monitoring" title=" environmental monitoring"> environmental monitoring</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-wavelength" title=" multi-wavelength"> multi-wavelength</a>, <a href="https://publications.waset.org/abstracts/search?q=dual-mode" title=" dual-mode"> dual-mode</a>, <a href="https://publications.waset.org/abstracts/search?q=technology" title=" technology"> technology</a>, <a href="https://publications.waset.org/abstracts/search?q=urban%20planning" title=" urban planning"> urban planning</a>, <a href="https://publications.waset.org/abstracts/search?q=infrastructure" title=" infrastructure"> infrastructure</a>, <a href="https://publications.waset.org/abstracts/search?q=resolution" title=" resolution"> resolution</a>, <a href="https://publications.waset.org/abstracts/search?q=miniaturization" title=" miniaturization"> miniaturization</a> </p> <a href="https://publications.waset.org/abstracts/179167/advancing-horizons-standardized-future-trends-in-lidar-and-remote-sensing-technologies" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/179167.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">83</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3894</span> Land Cover Remote Sensing Classification Advanced Neural Networks Supervised Learning</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Eiman%20Kattan">Eiman Kattan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This study aims to evaluate the impact of classifying labelled remote sensing images conventional neural network (CNN) architecture, i.e., AlexNet on different land cover scenarios based on two remotely sensed datasets from different point of views such as the computational time and performance. Thus, a set of experiments were conducted to specify the effectiveness of the selected convolutional neural network using two implementing approaches, named fully trained and fine-tuned. For validation purposes, two remote sensing datasets, AID, and RSSCN7 which are publicly available and have different land covers features were used in the experiments. These datasets have a wide diversity of input data, number of classes, amount of labelled data, and texture patterns. A specifically designed interactive deep learning GPU training platform for image classification (Nvidia Digit) was employed in the experiments. It has shown efficiency in training, validation, and testing. 
As a result, the fully trained approach achieved only modest accuracies of 73.346% and 71.857% on the two datasets, AID and RSSCN7, within 24 min 1 s and 8 min 3 s respectively. In contrast, the fine-tuning approach produced a dramatic improvement in classification performance, reaching 92.5% and 91% within 24 min 44 s and 8 min 41 s respectively. These findings open opportunities for better classification performance in various applications, such as agriculture and crop remote sensing. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=conventional%20neural%20network" title="conventional neural network">conventional neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=land%20cover" title=" land cover"> land cover</a>, <a href="https://publications.waset.org/abstracts/search?q=land%20use" title=" land use"> land use</a> </p> <a href="https://publications.waset.org/abstracts/80774/land-cover-remote-sensing-classification-advanced-neural-networks-supervised-learning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/80774.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">370</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3893</span> Automated Feature Extraction and Object-Based Detection from High-Resolution Aerial Photos Based on Machine Learning and Artificial Intelligence</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mohammed%20Al%20Sulaimani">Mohammed Al Sulaimani</a>, <a href="https://publications.waset.org/abstracts/search?q=Hamad%20Al%20Manhi"> Hamad Al Manhi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> With the development of remote sensing technology, the resolution of optical remote sensing images has greatly improved and imagery has become widely available. Numerous detectors have been developed for detecting different types of objects. In the past few years, remote sensing has benefited greatly from deep learning, particularly deep convolutional neural networks (CNNs), which hold great promise for meeting the challenging needs of remote sensing and solving problems across many fields and applications. Unmanned Aerial Systems (UAS) have become the preferred means of acquiring aerial photos in many organizations because of their high resolution and accuracy, which make the identification and detection of very small features much easier than with satellite images, and this has opened a new era for deep learning not only in feature extraction and prediction but also in analysis. This work addresses the capacity of machine learning and deep learning to detect and extract oil leaks from onshore flowlines using high-resolution aerial photos acquired by a UAS fitted with an RGB sensor, supporting early detection of these leaks and protecting the company from production losses and, most importantly, from environmental damage.
Two different approaches using different deep learning methods are demonstrated here. The first approach detects oil leaks in raw (unprocessed) aerial photos using the Single Shot Detector (SSD) deep learning model; the model draws bounding boxes around the leaks, and the results were extremely good. The second approach detects oil leaks in ortho-mosaiced (georeferenced) images by developing three deep learning models (Mask R-CNN, U-Net and PSPNet); post-processing is then performed to combine the results of these three models to achieve better detection and improved accuracy. Although only a relatively small amount of data was available for training, the trained models have shown good results in extracting the extent of the oil leaks and delivering accurate detection. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=GIS" title="GIS">GIS</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=oil%20leak%20detection" title=" oil leak detection"> oil leak detection</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=aerial%20photos" title=" aerial photos"> aerial photos</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20systems" title=" unmanned aerial systems"> unmanned aerial systems</a> </p> <a href="https://publications.waset.org/abstracts/187331/automated-feature-extraction-and-object-based-detection-from-high-resolution-aerial-photos-based-on-machine-learning-and-artificial-intelligence" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/187331.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">34</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3892</span> Application of Improved Semantic Communication Technology in Remote Sensing Data Transmission</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Tingwei%20Shu">Tingwei Shu</a>, <a href="https://publications.waset.org/abstracts/search?q=Dong%20Zhou"> Dong Zhou</a>, <a href="https://publications.waset.org/abstracts/search?q=Chengjun%20Guo"> Chengjun Guo</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Semantic communication is an emerging form of communication that achieves intelligent communication by extracting the semantic information of data at the source, transmitting it, and recovering the data at the receiving end. It can effectively address data transmission under conditions of large data volume, low SNR and restricted bandwidth. With the development of deep learning, semantic communication has continued to mature and is gradually being applied in fields such as the Internet of Things, Unmanned Aerial Vehicle cluster communication and remote sensing.
We propose an improved semantic communication system for transmitting remote sensing images when the data volume is huge and spectrum resources are limited. At the transmitting end, the semantic information of the remote sensing images must be extracted, but this raises a problem: a traditional semantic communication system based on a convolutional neural network cannot capture both the global and the local semantic information of the image, which results in less-than-ideal image recovery at the receiving end. We therefore adopt an improved Vision-Transformer-based structure as the semantic encoder, instead of the mainstream CNN-based encoder, to extract the image semantic features. We first pre-process the remote sensing images to improve their resolution and thus obtain images carrying more semantic information: the wavelet transform decomposes each image into high-frequency and low-frequency components, bilinear interpolation is applied to the high-frequency components and bicubic interpolation to the low-frequency components, and the inverse wavelet transform then yields the preprocessed image. The improved Vision-Transformer structure serves as the semantic encoder to extract and transmit the semantic information of the remote sensing images; it trains better on large data volumes, extracts better image semantic features, and uses a multi-layer self-attention mechanism to capture the correlation between semantic features and reduce redundant features. Secondly, to improve coding efficiency, we reduce the quadratic complexity of the self-attention mechanism to linear, which increases the image processing speed of the model. We conducted experimental simulations on the RSOD dataset and compared the designed system with a CNN-based semantic communication system and with image coding methods such as BPG and JPEG, verifying that the method effectively alleviates the problem of excessive data volume and improves the performance of image data communication.
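<p class="card-text">A minimal sketch of the wavelet-based pre-processing step described above (decompose, interpolate the sub-bands, inverse transform) might look like the following. It assumes a single-band float image and the PyWavelets and OpenCV libraries; the Haar wavelet and the 2x upscaling factor are illustrative choices, not the paper's exact settings.</p>
<pre><code class="language-python"># Sketch of resolution enhancement by wavelet decomposition, sub-band interpolation
# and inverse wavelet transform: bicubic for the low-frequency (approximation)
# sub-band, bilinear for the high-frequency detail sub-bands.
import cv2
import numpy as np
import pywt

def _resize(band, size, interp):
    # cv2.resize expects (width, height); cast to float32 for safety
    return cv2.resize(band.astype(np.float32), size, interpolation=interp)

def wavelet_upsample(img, scale=2, wavelet="haar"):
    """img: 2-D float array; returns an image roughly scale x larger."""
    cA, (cH, cV, cD) = pywt.dwt2(img, wavelet)
    h, w = cA.shape
    size = (w * scale, h * scale)
    cA = _resize(cA, size, cv2.INTER_CUBIC)     # low-frequency component
    cH = _resize(cH, size, cv2.INTER_LINEAR)    # high-frequency components
    cV = _resize(cV, size, cv2.INTER_LINEAR)
    cD = _resize(cD, size, cv2.INTER_LINEAR)
    return pywt.idwt2((cA, (cH, cV, cD)), wavelet)

# Hypothetical usage on a random array standing in for one band of a remote sensing image.
band = np.random.rand(256, 256).astype(np.float32)
enhanced = wavelet_upsample(band)               # ~512 x 512 output
print(enhanced.shape)
</code></pre>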
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=semantic%20communication" title="semantic communication">semantic communication</a>, <a href="https://publications.waset.org/abstracts/search?q=transformer" title=" transformer"> transformer</a>, <a href="https://publications.waset.org/abstracts/search?q=wavelet%20transform" title=" wavelet transform"> wavelet transform</a>, <a href="https://publications.waset.org/abstracts/search?q=data%20processing" title=" data processing"> data processing</a> </p> <a href="https://publications.waset.org/abstracts/167726/application-of-improved-semantic-communication-technology-in-remote-sensing-data-transmission" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/167726.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">78</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3891</span> A Novel Spectral Index for Automatic Shadow Detection in Urban Mapping Based on WorldView-2 Satellite Imagery</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Kaveh%20Shahi">Kaveh Shahi</a>, <a href="https://publications.waset.org/abstracts/search?q=Helmi%20Z.%20M.%20Shafri"> Helmi Z. M. Shafri</a>, <a href="https://publications.waset.org/abstracts/search?q=Ebrahim%20Taherzadeh"> Ebrahim Taherzadeh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In remote sensing, shadow causes problems in many applications such as change detection and classification. It is caused by objects which are elevated, thus can directly affect the accuracy of information. For these reasons, it is very important to detect shadows particularly in urban high spatial resolution imagery which created a significant problem. This paper focuses on automatic shadow detection based on a new spectral index for multispectral imagery known as Shadow Detection Index (SDI). The new spectral index was tested on different areas of World-View 2 images and the results demonstrated that the new spectral index has a massive potential to extract shadows effectively and automatically. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=spectral%20index" title="spectral index">spectral index</a>, <a href="https://publications.waset.org/abstracts/search?q=shadow%20detection" title=" shadow detection"> shadow detection</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images" title=" remote sensing images"> remote sensing images</a>, <a href="https://publications.waset.org/abstracts/search?q=World-View%202" title=" World-View 2"> World-View 2</a> </p> <a href="https://publications.waset.org/abstracts/13500/a-novel-spectral-index-for-automatic-shadow-detection-in-urban-mapping-based-on-worldview-2-satellite-imagery" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/13500.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">538</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3890</span> Use of Satellite Imaging to Understand Earth’s Surface Features: A Roadmap</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sabri%20Serkan%20Gulluoglu">Sabri Serkan Gulluoglu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> It is possible with Geographic Information Systems (GIS) that the information about all natural and artificial resources on the earth is obtained taking advantage of satellite images are obtained by remote sensing techniques. However, determination of unknown sources, mapping of the distribution and efficient evaluation of resources are defined may not be possible with the original image. For this reasons, some process steps are needed like transformation, pre-processing, image enhancement and classification to provide the most accurate assessment numerically and visually. Many studies which present the phases of obtaining and processing of the satellite images have examined in the literature study. The research showed that the determination of the process steps may be followed at this subject with the existence of a common whole may provide to progress the process rapidly for the necessary and possible studies which will be. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title="remote sensing">remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=satellite%20imaging" title=" satellite imaging"> satellite imaging</a>, <a href="https://publications.waset.org/abstracts/search?q=gis" title=" gis"> gis</a>, <a href="https://publications.waset.org/abstracts/search?q=computer%20science" title=" computer science"> computer science</a>, <a href="https://publications.waset.org/abstracts/search?q=information" title=" information"> information</a> </p> <a href="https://publications.waset.org/abstracts/5599/use-of-satellite-imaging-to-understand-earths-surface-features-a-roadmap" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/5599.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">318</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3889</span> Tree Species Classification Using Effective Features of Polarimetric SAR and Hyperspectral Images</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Milad%20Vahidi">Milad Vahidi</a>, <a href="https://publications.waset.org/abstracts/search?q=Mahmod%20R.%20Sahebi"> Mahmod R. Sahebi</a>, <a href="https://publications.waset.org/abstracts/search?q=Mehrnoosh%20Omati"> Mehrnoosh Omati</a>, <a href="https://publications.waset.org/abstracts/search?q=Reza%20Mohammadi"> Reza Mohammadi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Forest management organizations need information to perform their work effectively. Remote sensing is an effective method to acquire information from the Earth. Two datasets of remote sensing images were used to classify forested regions. Firstly, all of extractable features from hyperspectral and PolSAR images were extracted. The optical features were spectral indexes related to the chemical, water contents, structural indexes, effective bands and absorption features. Also, PolSAR features were the original data, target decomposition components, and SAR discriminators features. Secondly, the particle swarm optimization (PSO) and the genetic algorithms (GA) were applied to select optimization features. Furthermore, the support vector machine (SVM) classifier was used to classify the image. The results showed that the combination of PSO and SVM had higher overall accuracy than the other cases. This combination provided overall accuracy about 90.56%. The effective features were the spectral index, the bands in shortwave infrared (SWIR) and the visible ranges and certain PolSAR features. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=hyperspectral" title="hyperspectral">hyperspectral</a>, <a href="https://publications.waset.org/abstracts/search?q=PolSAR" title=" PolSAR"> PolSAR</a>, <a href="https://publications.waset.org/abstracts/search?q=feature%20selection" title=" feature selection"> feature selection</a>, <a href="https://publications.waset.org/abstracts/search?q=SVM" title=" SVM"> SVM</a> </p> <a href="https://publications.waset.org/abstracts/95461/tree-species-classification-using-effective-features-of-polarimetric-sar-and-hyperspectral-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/95461.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">416</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3888</span> Runoff Estimation Using NRCS-CN Method</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=E.%20K.%20Naseela">E. K. Naseela</a>, <a href="https://publications.waset.org/abstracts/search?q=B.%20M.%20Dodamani"> B. M. Dodamani</a>, <a href="https://publications.waset.org/abstracts/search?q=Chaithra%20Chandran"> Chaithra Chandran</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The GIS and remote sensing techniques facilitate accurate estimation of surface runoff from watershed. In the present study an attempt has been made to evaluate the applicability of Natural Resources Service Curve Number method using GIS and Remote sensing technique in the upper Krishna basin (69,425 Sq.km). Landsat 7 (with resolution 30 m) satellite data for the year 2012 has been used for the preparation of land use land cover (LU/LC) map. The hydrologic soil group is mapped using GIS platform. The weighted curve numbers (CN) for all the 5 subcatchments calculated on the basis of LU/LC type and hydrologic soil class in the area by considering antecedent moisture condition. Monthly rainfall data was available for 58 raingauge stations. Overlay technique is adopted for generating weighted curve number. Results of the study show that land use changes determined from satellite images are useful in studying the runoff response of the basin. The results showed that there is no significant difference between observed and estimated runoff depths. For each subcatchment, statistically positive correlations were detected between observed and estimated runoff depth (0.6<R^2<1). Therefore, the study reveals that Remote Sensing and GIS based NRCS-CN model can be used effectively to estimate the runoff from the ungauged watersheds when adequate hydrological information is not available. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=curve%20number" title="curve number">curve number</a>, <a href="https://publications.waset.org/abstracts/search?q=GIS" title=" GIS"> GIS</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=runoff" title=" runoff"> runoff</a> </p> <a href="https://publications.waset.org/abstracts/32748/runoff-estimation-using-nrcs-cn-method" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/32748.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">539</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3887</span> Assesing Spatio-Temporal Growth of Kochi City Using Remote Sensing Data</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Navya%20Saira%20%20George">Navya Saira George</a>, <a href="https://publications.waset.org/abstracts/search?q=Patroba%20Achola%20Odera"> Patroba Achola Odera</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This study aims to determine spatio-temporal expansion of Kochi City, situated on the west coast of Kerala State in India. Remote sensing and GIS techniques have been used to determine land use/cover and urban expansion of the City. Classification of Landsat images of the years 1973, 1988, 2002 and 2018 have been used to reproduce a visual story of the growth of the City over a period of 45 years. Accuracy range of 0.79 ~ 0.86 is achieved with kappa coefficient range of 0.69 ~ 0.80. Results show that the areas covered by vegetation and water bodies decreased progressively from 53.0 ~ 30.1% and 34.1 ~ 26.2% respectively, while built-up areas increased steadily from 12.5 to 42.2% over the entire study period (1973 ~ 2018). The shift in land use from agriculture to non-agriculture may be attributed to the land reforms since 1980s. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Geographical%20Information%20Systems" title="Geographical Information Systems">Geographical Information Systems</a>, <a href="https://publications.waset.org/abstracts/search?q=Kochi%20City" title=" Kochi City"> Kochi City</a>, <a href="https://publications.waset.org/abstracts/search?q=Land%20use%2Fcover" title=" Land use/cover"> Land use/cover</a>, <a href="https://publications.waset.org/abstracts/search?q=Remote%20Sensing" title=" Remote Sensing"> Remote Sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=Urban%20Sprawl" title=" Urban Sprawl"> Urban Sprawl</a> </p> <a href="https://publications.waset.org/abstracts/124291/assesing-spatio-temporal-growth-of-kochi-city-using-remote-sensing-data" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/124291.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">129</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3886</span> Sub-Pixel Mapping Based on New Mixed Interpolation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Zeyu%20Zhou">Zeyu Zhou</a>, <a href="https://publications.waset.org/abstracts/search?q=Xiaojun%20Bi"> Xiaojun Bi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Due to the limited environmental parameters and the limited resolution of the sensor, the universal existence of the mixed pixels in the process of remote sensing images restricts the spatial resolution of the remote sensing images. Sub-pixel mapping technology can effectively improve the spatial resolution. As the bilinear interpolation algorithm inevitably produces the edge blur effect, which leads to the inaccurate sub-pixel mapping results. In order to avoid the edge blur effect that affects the sub-pixel mapping results in the interpolation process, this paper presents a new edge-directed interpolation algorithm which uses the covariance adaptive interpolation algorithm on the edge of the low-resolution image and uses bilinear interpolation algorithm in the low-resolution image smooth area. By using the edge-directed interpolation algorithm, the super-resolution of the image with low resolution is obtained, and we get the percentage of each sub-pixel under a certain type of high-resolution image. Then we rely on the probability value as a soft attribute estimate and carry out sub-pixel scale under the ‘hard classification’. Finally, we get the result of sub-pixel mapping. Through the experiment, we compare the algorithm and the bilinear algorithm given in this paper to the results of the sub-pixel mapping method. It is found that the sub-pixel mapping method based on the edge-directed interpolation algorithm has better edge effect and higher mapping accuracy. The results of the paper meet our original intention of the question. At the same time, the method does not require iterative computation and training of samples, making it easier to implement. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images" title="remote sensing images">remote sensing images</a>, <a href="https://publications.waset.org/abstracts/search?q=sub-pixel%20mapping" title=" sub-pixel mapping"> sub-pixel mapping</a>, <a href="https://publications.waset.org/abstracts/search?q=bilinear%20interpolation" title=" bilinear interpolation"> bilinear interpolation</a>, <a href="https://publications.waset.org/abstracts/search?q=edge-directed%20interpolation" title=" edge-directed interpolation"> edge-directed interpolation</a> </p> <a href="https://publications.waset.org/abstracts/77883/sub-pixel-mapping-based-on-new-mixed-interpolation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/77883.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">229</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3885</span> Self-Attention Mechanism for Target Hiding Based on Satellite Images</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hao%20Yuan">Hao Yuan</a>, <a href="https://publications.waset.org/abstracts/search?q=Yongjian%20Shen"> Yongjian Shen</a>, <a href="https://publications.waset.org/abstracts/search?q=Xiangjun%20He"> Xiangjun He</a>, <a href="https://publications.waset.org/abstracts/search?q=Yuheng%20Li"> Yuheng Li</a>, <a href="https://publications.waset.org/abstracts/search?q=Zhouzhou%20Zhang"> Zhouzhou Zhang</a>, <a href="https://publications.waset.org/abstracts/search?q=Pengyu%20Zhang"> Pengyu Zhang</a>, <a href="https://publications.waset.org/abstracts/search?q=Minkang%20Cai"> Minkang Cai</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Remote sensing data can provide support for decision-making in disaster assessment or disaster relief. The traditional processing methods of sensitive targets in remote sensing mapping are mainly based on manual retrieval and image editing tools, which are inefficient. Methods based on deep learning for sensitive target hiding are faster and more flexible. But these methods have disadvantages in training time and cost of calculation. This paper proposed a target hiding model Self Attention (SA) Deepfill, which used self-attention modules to replace part of gated convolution layers in image inpainting. By this operation, the calculation amount of the model becomes smaller, and the performance is improved. And this paper adds free-form masks to the model’s training to enhance the model’s universal. The experiment on an open remote sensing dataset proved the efficiency of our method. Moreover, through experimental comparison, the proposed method can train for a longer time without over-fitting. Finally, compared with the existing methods, the proposed model has lower computational weight and better performance. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20mapping" title="remote sensing mapping">remote sensing mapping</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20inpainting" title=" image inpainting"> image inpainting</a>, <a href="https://publications.waset.org/abstracts/search?q=self-attention%20mechanism" title=" self-attention mechanism"> self-attention mechanism</a>, <a href="https://publications.waset.org/abstracts/search?q=target%20hiding" title=" target hiding"> target hiding</a> </p> <a href="https://publications.waset.org/abstracts/166828/self-attention-mechanism-for-target-hiding-based-on-satellite-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/166828.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">136</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3884</span> The Study of Dengue Fever Outbreak in Thailand Using Geospatial Techniques, Satellite Remote Sensing Data and Big Data</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Tanapat%20Chongkamunkong">Tanapat Chongkamunkong</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The objective of this paper is to present a practical use of Geographic Information System (GIS) to the public health from spatial correlation between multiple factors and dengue fever outbreak. Meteorological factors, demographic factors and environmental factors are compiled using GIS techniques along with the Global Satellite Mapping Remote Sensing (RS) data. We use monthly dengue fever cases, population density, precipitation, Digital Elevation Model (DEM) data. The scope cover study area under climate change of the El Niño–Southern Oscillation (ENSO) indicated by sea surface temperature (SST) and study area in 12 provinces of Thailand as remote sensing (RS) data from January 2007 to December 2014. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=dengue%20fever" title="dengue fever">dengue fever</a>, <a href="https://publications.waset.org/abstracts/search?q=sea%20surface%20temperature" title=" sea surface temperature"> sea surface temperature</a>, <a href="https://publications.waset.org/abstracts/search?q=Geographic%20Information%20System%20%28GIS%29" title=" Geographic Information System (GIS)"> Geographic Information System (GIS)</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title=" remote sensing"> remote sensing</a> </p> <a href="https://publications.waset.org/abstracts/80471/the-study-of-dengue-fever-outbreak-in-thailand-using-geospatial-techniques-satellite-remote-sensing-data-and-big-data" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/80471.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">198</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">&lsaquo;</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=130">130</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=131">131</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images&amp;page=2" rel="next">&rsaquo;</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" 
href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>
