
Tree Species Classification in a Complex Brazilian Tropical Forest Using Hyperspectral and LiDAR Data

Rorai Pereira Martins-Neto, Antonio Maria Garcia Tommaselli, Nilton Nobuhiro Imai, Eija Honkavaara, Milto Miltiadou, Erika Akemi Saito Moriya and Hassan Camil David

Forests 2023, 14(5), 945; https://doi.org/10.3390/f14050945 (Article)
Published: 4 May 2023

Abstract

This study experiments with different combinations of UAV hyperspectral data and LiDAR metrics for classifying eight tree species found in a remnant of the Brazilian Atlantic Forest, the most degraded Brazilian biome, highly fragmented yet structurally complex. The species were selected based on the number of tree samples available in the plot data and on the fact that the UAV imagery does not acquire information below the forest canopy; because of the complexity of the forest, only species present in the upper canopy of the remnant were included in the classification. A combination of hyperspectral UAV images and LiDAR point clouds was used in the experiment. The hyperspectral images were photogrammetrically and radiometrically processed to obtain orthomosaics with reflectance factor values. Raw spectra were extracted from the trees, and vegetation indices (VIs) were calculated. Regarding the LiDAR data, both the point cloud, referred to as Peak Returns (PR), and the full-waveform (FWF) LiDAR were included in this study. The point clouds were processed to normalize the intensities and heights, and different metrics were extracted for each data type (PR and FWF). Segmentation was performed semi-automatically using the superpixel algorithm, followed by manual correction to ensure precise tree crown delineation before tree species classification. Thirteen classification scenarios were tested, combining spectral features and LiDAR metrics or using them separately. The best result, an accuracy of 76%, was obtained with all features transformed with principal component analysis and did not differ significantly from the scenarios using the raw spectra or VIs together with PR or FWF LiDAR metrics. The combination of spectral data with geometric information from LiDAR improved the classification of tree species in a complex tropical forest, and these results can serve to inform management and conservation practices for these forest remnants.

Keywords: Brazilian Atlantic Forest; tree species mapping; LiDAR; hyperspectral imaging; superpixel segmentation
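The abstract describes extracting raw spectra from the reflectance orthomosaics and deriving vegetation indices from them. The sketch below is a minimal illustration of that step, not the authors' code: the band wavelengths, the synthetic cube, and the choice of a normalized-difference index (NDVI-style) are assumptions, since the abstract does not list the specific VIs used.

```python
# Illustrative sketch: a normalized-difference index from a hyperspectral
# reflectance cube. Band centers and the index choice are assumptions.
import numpy as np

def normalized_difference(cube: np.ndarray, wavelengths: np.ndarray,
                          wl_a: float, wl_b: float) -> np.ndarray:
    """cube: (bands, rows, cols) reflectance factors; wavelengths in nm."""
    band_a = cube[np.argmin(np.abs(wavelengths - wl_a))].astype(float)
    band_b = cube[np.argmin(np.abs(wavelengths - wl_b))].astype(float)
    return (band_a - band_b) / (band_a + band_b + 1e-9)

rng = np.random.default_rng(0)
cube = rng.uniform(0.0, 0.6, size=(25, 100, 100))   # synthetic 25-band cube
wavelengths = np.linspace(500, 900, 25)              # assumed band centers (nm)
ndvi = normalized_difference(cube, wavelengths, 800.0, 660.0)  # NIR vs. red
print(ndvi.shape, float(ndvi.mean()))
```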
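For the LiDAR side, the abstract states that the PR and FWF point clouds were height- and intensity-normalized and that structural metrics were extracted per data type. The following sketch shows, under stated assumptions, what per-crown metrics from height-normalized points could look like; the paper's actual PR/FWF metric set is richer and is not reproduced here.

```python
# Illustrative sketch: simple per-crown structural metrics from
# height-normalized LiDAR points (heights above ground, metres).
import numpy as np

def crown_metrics(z: np.ndarray, cover_threshold: float = 2.0) -> dict:
    """Percentile heights and canopy cover above an assumed 2 m threshold."""
    return {
        "h_max": float(z.max()),
        "h_mean": float(z.mean()),
        "h_p25": float(np.percentile(z, 25)),
        "h_p50": float(np.percentile(z, 50)),
        "h_p75": float(np.percentile(z, 75)),
        "h_p95": float(np.percentile(z, 95)),
        "cover": float(np.mean(z > cover_threshold)),
    }

rng = np.random.default_rng(1)
z = rng.gamma(shape=6.0, scale=2.5, size=5000)   # synthetic crown heights
print(crown_metrics(z))
```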
href="/authors">For Authors</a> </li> <li> <a href="/reviewers">For Reviewers</a> </li> <li> <a href="/editors">For Editors</a> </li> <li> <a href="/librarians">For Librarians</a> </li> <li> <a href="/publishing_services">For Publishers</a> </li> <li> <a href="/societies">For Societies</a> </li> <li> <a href="/conference_organizers">For Conference Organizers</a> </li> </ul> </div> <div class="small-7 columns"> <ul> <li> <a href="/openaccess">Open Access Policy</a> </li> <li> <a href="/ioap">Institutional Open Access Program</a> </li> <li> <a href="/special_issues_guidelines">Special Issues Guidelines</a> </li> <li> <a href="/editorial_process">Editorial Process</a> </li> <li> <a href="/ethics">Research and Publication Ethics</a> </li> <li> <a href="/apc">Article Processing Charges</a> </li> <li> <a href="/awards">Awards</a> </li> <li> <a href="/testimonials">Testimonials</a> </li> </ul> </div> </div> </li> </ul> </li> <li class="menu-item"> <a href="/authors/english">Author Services</a> </li> <li class="menu-item"> <a href="/about/initiatives" data-dropdown="initiatives-dropdown" aria-controls="initiatives-dropdown" aria-expanded="false" data-options="is_hover: true; hover_timeout: 200">Initiatives</a> <ul id="initiatives-dropdown" class="f-dropdown dropdown-wrapper dropdown-wrapper__small" data-dropdown-content aria-hidden="true" tabindex="-1"> <li> <div class="row"> <div class="small-12 columns"> <ul> <li> <a href="https://sciforum.net" target="_blank" rel="noopener noreferrer"> Sciforum </a> </li> <li> <a href="https://www.mdpi.com/books" target="_blank" rel="noopener noreferrer"> MDPI Books </a> </li> <li> <a href="https://www.preprints.org" target="_blank" rel="noopener noreferrer"> Preprints.org </a> </li> <li> <a href="https://www.scilit.com" target="_blank" rel="noopener noreferrer"> Scilit </a> </li> <li> <a href="https://sciprofiles.com" target="_blank" rel="noopener noreferrer"> SciProfiles </a> </li> <li> <a href="https://encyclopedia.pub" target="_blank" rel="noopener noreferrer"> Encyclopedia </a> </li> <li> <a href="https://jams.pub" target="_blank" rel="noopener noreferrer"> JAMS </a> </li> <li> <a href="/about/proceedings"> Proceedings Series </a> </li> </ul> </div> </div> </li> </ul> </li> <li class="menu-item"> <a href="/about" data-dropdown="about-dropdown" aria-controls="about-dropdown" aria-expanded="false" data-options="is_hover: true; hover_timeout: 200">About</a> <ul id="about-dropdown" class="f-dropdown dropdown-wrapper dropdown-wrapper__small" data-dropdown-content aria-hidden="true" tabindex="-1"> <li> <div class="row"> <div class="small-12 columns"> <ul> <li> <a href="/about"> Overview </a> </li> <li> <a href="/about/contact"> Contact </a> </li> <li> <a href="https://careers.mdpi.com" target="_blank" rel="noopener noreferrer"> Careers </a> </li> <li> <a href="/about/announcements"> News </a> </li> <li> <a href="/about/press"> Press </a> </li> <li> <a href="http://blog.mdpi.com/" target="_blank" rel="noopener noreferrer"> Blog </a> </li> </ul> </div> </div> </li> </ul> </li> </ul> </div> <div class="large-3 columns text-right full-size-menu__buttons"> <div> <a class="button button--default-inversed UA_SignInUpButton" href="/user/login">Sign In / Sign Up</a> <a class="button button--default js-journal-active-only-link js-journal-active-only-submit-link UC_NavSubmitButton" href=" https://susy.mdpi.com/user/manuscripts/upload?journal=forests " data-disabledmessage="new submissions are not possible.">Submit</a> </div> </div> </div> </div> <div 
class="header-divider">&nbsp;</div> <div class="search-container hide-for-small-down row search-container__homepage-scheme"> <form id="basic_search" style="background-color: inherit !important;" class="large-12 medium-12 columns " action="/search" method="get"> <div class="row search-container__main-elements"> <div class="large-2 medium-2 small-12 columns text-right1 small-only-text-left"> <div class="show-for-medium-up"> <div class="search-input-label">&nbsp;</div> </div> <span class="search-container__title">Search<span class="hide-for-medium"> for Articles</span><span class="hide-for-small">:</span></span> </div> <div class="custom-accordion-for-small-screen-content"> <div class="large-2 medium-2 small-6 columns "> <div class=""> <div class="search-input-label">Title / Keyword</div> </div> <input type="text" placeholder="Title / Keyword" id="q" tabindex="1" name="q" value="" /> </div> <div class="large-2 medium-2 small-6 columns "> <div class=""> <div class="search-input-label">Author / Affiliation / Email</div> </div> <input type="text" id="authors" placeholder="Author / Affiliation / Email" tabindex="2" name="authors" value="" /> </div> <div class="large-2 medium-2 small-6 columns "> <div class=""> <div class="search-input-label">Journal</div> </div> <select id="journal" tabindex="3" name="journal" class="chosen-select"> <option value="">All Journals</option> <option value="acoustics" > Acoustics </option> <option value="amh" > Acta Microbiologica Hellenica (AMH) </option> <option value="actuators" > Actuators </option> <option value="adhesives" > Adhesives </option> <option value="admsci" > Administrative Sciences </option> <option value="adolescents" > Adolescents </option> <option value="arm" > Advances in Respiratory Medicine (ARM) </option> <option value="aerobiology" > Aerobiology </option> <option value="aerospace" > Aerospace </option> <option value="agriculture" > Agriculture </option> <option value="agriengineering" > AgriEngineering </option> <option value="agrochemicals" > Agrochemicals </option> <option value="agronomy" > Agronomy </option> <option value="ai" > AI </option> <option value="aisens" > AI Sensors </option> <option value="air" > Air </option> <option value="algorithms" > Algorithms </option> <option value="allergies" > Allergies </option> <option value="alloys" > Alloys </option> <option value="analytica" > Analytica </option> <option value="analytics" > Analytics </option> <option value="anatomia" > Anatomia </option> <option value="anesthres" > Anesthesia Research </option> <option value="animals" > Animals </option> <option value="antibiotics" > Antibiotics </option> <option value="antibodies" > Antibodies </option> <option value="antioxidants" > Antioxidants </option> <option value="applbiosci" > Applied Biosciences </option> <option value="applmech" > Applied Mechanics </option> <option value="applmicrobiol" > Applied Microbiology </option> <option value="applnano" > Applied Nano </option> <option value="applsci" > Applied Sciences </option> <option value="asi" > Applied System Innovation (ASI) </option> <option value="appliedchem" > AppliedChem </option> <option value="appliedmath" > AppliedMath </option> <option value="aquacj" > Aquaculture Journal </option> <option value="architecture" > Architecture </option> <option value="arthropoda" > Arthropoda </option> <option value="arts" > Arts </option> <option value="astronomy" > Astronomy </option> <option value="atmosphere" > Atmosphere </option> <option value="atoms" > Atoms </option> <option 
value="audiolres" > Audiology Research </option> <option value="automation" > Automation </option> <option value="axioms" > Axioms </option> <option value="bacteria" > Bacteria </option> <option value="batteries" > Batteries </option> <option value="behavsci" > Behavioral Sciences </option> <option value="beverages" > Beverages </option> <option value="BDCC" > Big Data and Cognitive Computing (BDCC) </option> <option value="biochem" > BioChem </option> <option value="bioengineering" > Bioengineering </option> <option value="biologics" > Biologics </option> <option value="biology" > Biology </option> <option value="blsf" > Biology and Life Sciences Forum </option> <option value="biomass" > Biomass </option> <option value="biomechanics" > Biomechanics </option> <option value="biomed" > BioMed </option> <option value="biomedicines" > Biomedicines </option> <option value="biomedinformatics" > BioMedInformatics </option> <option value="biomimetics" > Biomimetics </option> <option value="biomolecules" > Biomolecules </option> <option value="biophysica" > Biophysica </option> <option value="biosensors" > Biosensors </option> <option value="biosphere" > Biosphere </option> <option value="biotech" > BioTech </option> <option value="birds" > Birds </option> <option value="blockchains" > Blockchains </option> <option value="brainsci" > Brain Sciences </option> <option value="buildings" > Buildings </option> <option value="businesses" > Businesses </option> <option value="carbon" > C (Journal of Carbon Research) </option> <option value="cancers" > Cancers </option> <option value="cardiogenetics" > Cardiogenetics </option> <option value="catalysts" > Catalysts </option> <option value="cells" > Cells </option> <option value="ceramics" > Ceramics </option> <option value="challenges" > Challenges </option> <option value="ChemEngineering" > ChemEngineering </option> <option value="chemistry" > Chemistry </option> <option value="chemproc" > Chemistry Proceedings </option> <option value="chemosensors" > Chemosensors </option> <option value="children" > Children </option> <option value="chips" > Chips </option> <option value="civileng" > CivilEng </option> <option value="cleantechnol" > Clean Technologies (Clean Technol.) 
</option> <option value="climate" > Climate </option> <option value="ctn" > Clinical and Translational Neuroscience (CTN) </option> <option value="clinbioenerg" > Clinical Bioenergetics </option> <option value="clinpract" > Clinics and Practice </option> <option value="clockssleep" > Clocks &amp; Sleep </option> <option value="coasts" > Coasts </option> <option value="coatings" > Coatings </option> <option value="colloids" > Colloids and Interfaces </option> <option value="colorants" > Colorants </option> <option value="commodities" > Commodities </option> <option value="complications" > Complications </option> <option value="compounds" > Compounds </option> <option value="computation" > Computation </option> <option value="csmf" > Computer Sciences &amp; Mathematics Forum </option> <option value="computers" > Computers </option> <option value="condensedmatter" > Condensed Matter </option> <option value="conservation" > Conservation </option> <option value="constrmater" > Construction Materials </option> <option value="cmd" > Corrosion and Materials Degradation (CMD) </option> <option value="cosmetics" > Cosmetics </option> <option value="covid" > COVID </option> <option value="cmtr" > Craniomaxillofacial Trauma &amp; Reconstruction (CMTR) </option> <option value="crops" > Crops </option> <option value="cryo" > Cryo </option> <option value="cryptography" > Cryptography </option> <option value="crystals" > Crystals </option> <option value="cimb" > Current Issues in Molecular Biology (CIMB) </option> <option value="curroncol" > Current Oncology </option> <option value="dairy" > Dairy </option> <option value="data" > Data </option> <option value="dentistry" > Dentistry Journal </option> <option value="dermato" > Dermato </option> <option value="dermatopathology" > Dermatopathology </option> <option value="designs" > Designs </option> <option value="diabetology" > Diabetology </option> <option value="diagnostics" > Diagnostics </option> <option value="dietetics" > Dietetics </option> <option value="digital" > Digital </option> <option value="disabilities" > Disabilities </option> <option value="diseases" > Diseases </option> <option value="diversity" > Diversity </option> <option value="dna" > DNA </option> <option value="drones" > Drones </option> <option value="ddc" > Drugs and Drug Candidates (DDC) </option> <option value="dynamics" > Dynamics </option> <option value="earth" > Earth </option> <option value="ecologies" > Ecologies </option> <option value="econometrics" > Econometrics </option> <option value="economies" > Economies </option> <option value="education" > Education Sciences </option> <option value="electricity" > Electricity </option> <option value="electrochem" > Electrochem </option> <option value="electronicmat" > Electronic Materials </option> <option value="electronics" > Electronics </option> <option value="ecm" > Emergency Care and Medicine </option> <option value="encyclopedia" > Encyclopedia </option> <option value="endocrines" > Endocrines </option> <option value="energies" > Energies </option> <option value="esa" > Energy Storage and Applications (ESA) </option> <option value="eng" > Eng </option> <option value="engproc" > Engineering Proceedings </option> <option value="entropy" > Entropy </option> <option value="eesp" > Environmental and Earth Sciences Proceedings </option> <option value="environments" > Environments </option> <option value="epidemiologia" > Epidemiologia </option> <option value="epigenomes" > Epigenomes </option> <option value="ebj" > European Burn 
Journal (EBJ) </option> <option value="ejihpe" > European Journal of Investigation in Health, Psychology and Education (EJIHPE) </option> <option value="fermentation" > Fermentation </option> <option value="fibers" > Fibers </option> <option value="fintech" > FinTech </option> <option value="fire" > Fire </option> <option value="fishes" > Fishes </option> <option value="fluids" > Fluids </option> <option value="foods" > Foods </option> <option value="forecasting" > Forecasting </option> <option value="forensicsci" > Forensic Sciences </option> <option value="forests" selected='selected'> Forests </option> <option value="fossstud" > Fossil Studies </option> <option value="foundations" > Foundations </option> <option value="fractalfract" > Fractal and Fractional (Fractal Fract) </option> <option value="fuels" > Fuels </option> <option value="future" > Future </option> <option value="futureinternet" > Future Internet </option> <option value="futurepharmacol" > Future Pharmacology </option> <option value="futuretransp" > Future Transportation </option> <option value="galaxies" > Galaxies </option> <option value="games" > Games </option> <option value="gases" > Gases </option> <option value="gastroent" > Gastroenterology Insights </option> <option value="gastrointestdisord" > Gastrointestinal Disorders </option> <option value="gastronomy" > Gastronomy </option> <option value="gels" > Gels </option> <option value="genealogy" > Genealogy </option> <option value="genes" > Genes </option> <option value="geographies" > Geographies </option> <option value="geohazards" > GeoHazards </option> <option value="geomatics" > Geomatics </option> <option value="geometry" > Geometry </option> <option value="geosciences" > Geosciences </option> <option value="geotechnics" > Geotechnics </option> <option value="geriatrics" > Geriatrics </option> <option value="glacies" > Glacies </option> <option value="gucdd" > Gout, Urate, and Crystal Deposition Disease (GUCDD) </option> <option value="grasses" > Grasses </option> <option value="greenhealth" > Green Health </option> <option value="hardware" > Hardware </option> <option value="healthcare" > Healthcare </option> <option value="hearts" > Hearts </option> <option value="hemato" > Hemato </option> <option value="hematolrep" > Hematology Reports </option> <option value="heritage" > Heritage </option> <option value="histories" > Histories </option> <option value="horticulturae" > Horticulturae </option> <option value="hospitals" > Hospitals </option> <option value="humanities" > Humanities </option> <option value="humans" > Humans </option> <option value="hydrobiology" > Hydrobiology </option> <option value="hydrogen" > Hydrogen </option> <option value="hydrology" > Hydrology </option> <option value="hygiene" > Hygiene </option> <option value="immuno" > Immuno </option> <option value="idr" > Infectious Disease Reports </option> <option value="informatics" > Informatics </option> <option value="information" > Information </option> <option value="infrastructures" > Infrastructures </option> <option value="inorganics" > Inorganics </option> <option value="insects" > Insects </option> <option value="instruments" > Instruments </option> <option value="iic" > Intelligent Infrastructure and Construction </option> <option value="ijerph" > International Journal of Environmental Research and Public Health (IJERPH) </option> <option value="ijfs" > International Journal of Financial Studies (IJFS) </option> <option value="ijms" > International Journal of Molecular Sciences 
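Tree crowns were delineated semi-automatically with a superpixel algorithm followed by manual correction. The abstract does not name the specific algorithm, so the sketch below uses SLIC from scikit-image purely as an illustrative stand-in; the false-colour composite is synthetic.

```python
# Illustrative sketch (assumption: SLIC as the superpixel algorithm).
import numpy as np
from skimage.segmentation import slic

rng = np.random.default_rng(2)
# Synthetic three-band false-colour composite, reflectance-like values in [0, 1].
composite = rng.uniform(0.0, 1.0, size=(200, 200, 3))

# Over-segment the canopy into superpixels; labels would then be merged and
# edited manually to delineate individual tree crowns.
labels = slic(composite, n_segments=300, compactness=10.0,
              start_label=1, channel_axis=-1)
print(labels.min(), labels.max())
```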
value="viewpoint">Viewpoint</option> </select> </div> <div class="large-1 medium-1 small-6 end columns small-push-6 medium-reset-order large-reset-order js-search-collapsed-button-container"> <div class="search-input-label">&nbsp;</div> <input type="submit" id="search" value="Search" class="button button--dark button--full-width searchButton1 US_SearchButton" tabindex="12"> </div> <div class="large-1 medium-1 small-6 end columns large-text-left small-only-text-center small-pull-6 medium-reset-order large-reset-order js-search-collapsed-link-container"> <div class="search-input-label">&nbsp;</div> <a class="main-search-clear search-container__link" href="#" onclick="openAdvanced(''); return false;">Advanced<span class="show-for-small-only"> Search</span></a> </div> </div> </div> <div class="search-container__advanced" style="margin-top: 0; padding-top: 0px; background-color: inherit; color: inherit;"> <div class="row"> <div class="large-2 medium-2 columns show-for-medium-up">&nbsp;</div> <div class="large-2 medium-2 small-6 columns "> <div class=""> <div class="search-input-label">Section</div> </div> <select id="section" tabindex="5" name="section" class="chosen-select"> <option value=""></option> </select> </div> <div class="large-2 medium-2 small-6 columns "> <div class=""> <div class="search-input-label">Special Issue</div> </div> <select id="special_issue" tabindex="6" name="special_issue" class="chosen-select"> <option value=""></option> </select> </div> <div class="large-1 medium-1 small-6 end columns "> <div class="search-input-label">Volume</div> <input type="text" id="volume" tabindex="7" name="volume" placeholder="..." value="14" /> </div> <div class="large-1 medium-1 small-6 end columns "> <div class="search-input-label">Issue</div> <input type="text" id="issue" tabindex="8" name="issue" placeholder="..." value="5" /> </div> <div class="large-1 medium-1 small-6 end columns "> <div class="search-input-label">Number</div> <input type="text" id="number" tabindex="9" name="number" placeholder="..." value="" /> </div> <div class="large-1 medium-1 small-6 end columns "> <div class="search-input-label">Page</div> <input type="text" id="page" tabindex="10" name="page" placeholder="..." 
value="" /> </div> <div class="large-1 medium-1 small-6 columns small-push-6 medium-reset order large-reset-order medium-reset-order js-search-expanded-button-container"></div> <div class="large-1 medium-1 small-6 columns large-text-left small-only-text-center small-pull-6 medium-reset-order large-reset-order js-search-expanded-link-container"></div> </div> </div> </form> <form id="advanced-search" class="large-12 medium-12 columns"> <div class="search-container__advanced"> <div id="advanced-search-template" class="row advanced-search-row"> <div class="large-2 medium-2 small-12 columns show-for-medium-up">&nbsp;</div> <div class="large-2 medium-2 small-3 columns connector-div"> <div class="search-input-label"><span class="show-for-medium-up">Logical Operator</span><span class="show-for-small">Operator</span></div> <select class="connector"> <option value="and">AND</option> <option value="or">OR</option> </select> </div> <div class="large-3 medium-3 small-6 columns search-text-div"> <div class="search-input-label">Search Text</div> <input type="text" class="search-text" placeholder="Search text"> </div> <div class="large-2 medium-2 small-6 large-offset-0 medium-offset-0 small-offset-3 columns search-field-div"> <div class="search-input-label">Search Type</div> <select class="search-field"> <option value="all">All fields</option> <option value="title">Title</option> <option value="abstract">Abstract</option> <option value="keywords">Keywords</option> <option value="authors">Authors</option> <option value="affiliations">Affiliations</option> <option value="doi">Doi</option> <option value="full_text">Full Text</option> <option value="references">References</option> </select> </div> <div class="large-1 medium-1 small-3 columns"> <div class="search-input-label">&nbsp;</div> <div class="search-action-div"> <div class="search-plus"> <i class="material-icons">add_circle_outline</i> </div> </div> <div class="search-action-div"> <div class="search-minus"> <i class="material-icons">remove_circle_outline</i> </div> </div> </div> <div class="large-1 medium-1 small-6 large-offset-0 medium-offset-0 small-offset-3 end columns"> <div class="search-input-label">&nbsp;</div> <input class="advanced-search-button button button--dark search-submit" type="submit" value="Search"> </div> <div class="large-1 medium-1 small-6 end columns show-for-medium-up"></div> </div> </div> </form> </div> <div class="header-divider">&nbsp;</div> <div class="breadcrumb row full-row"> <div class="breadcrumb__element"> <a href="/about/journals">Journals</a> </div> <div class="breadcrumb__element"> <a href="/journal/forests">Forests</a> </div> <div class="breadcrumb__element"> <a href="/1999-4907/14">Volume 14</a> </div> <div class="breadcrumb__element"> <a href="/1999-4907/14/5">Issue 5</a> </div> <div class="breadcrumb__element"> <a href="#">10.3390/f14050945</a> </div> </div> </header> <div id="main-content" class=""> <div class="row full-width row-fixed-left-column"> <div id="left-column" class="content__column large-3 medium-3 small-12 columns"> <div class="content__container"> <a href="/journal/forests"> <img src="https://pub.mdpi-res.com/img/journals/forests-logo.png?8600e93ff98dbf14" alt="forests-logo" title="Forests" style="max-height: 60px; margin: 0 0 0 0;"> </a> <div class="generic-item no-border"> <a class="button button--color button--full-width js-journal-active-only-link js-journal-active-only-submit-link UC_ArticleSubmitButton" href="https://susy.mdpi.com/user/manuscripts/upload?form%5Bjournal_id%5D%3D42" 
data-disabledmessage="creating new submissions is not possible."> Submit to this Journal </a> <a class="button button--color button--full-width js-journal-active-only-link UC_ArticleReviewButton" href="https://susy.mdpi.com/volunteer/journals/review" data-disabledmessage="volunteering as journal reviewer is not possible."> Review for this Journal </a> <a class="button button--color-inversed button--color-journal button--full-width js-journal-active-only-link UC_ArticleEditIssueButton" href="/journalproposal/sendproposalspecialissue/forests" data-path="/1999-4907/14/5/945" data-disabledmessage="proposing new special issue is not possible."> Propose a Special Issue </a> </div> <div class="generic-item link-article-menu show-for-small"> <a href="#" class="link-article-menu show-for-small"> <span class="closed">&#9658;</span> <span class="open" style="display: none;">&#9660;</span> Article Menu </a> </div> <div class="hide-small-down-initially UI_ArticleMenu"> <div class="generic-item"> <h2>Article Menu</h2> </div> <ul class="accordion accordion__menu" data-accordion data-options="multi_expand:true;toggleable: true"> <li class="accordion-navigation"> <a href="#academic_editors" class="accordion__title">Academic Editor</a> <div id="academic_editors" class="content active"> <div class="academic-editor-container " title="Department of Biology, University of Regina 3737 Wascana Pkwy, Regina, SK S4S 0A2, Canada"> <div class="sciprofiles-link" style="display: inline-block"><a class="sciprofiles-link__link" href="https://sciprofiles.com/profile/165979?utm_source=mdpi.com&amp;utm_medium=website&amp;utm_campaign=avatar_name" target="_blank" rel="noopener noreferrer"><img class="sciprofiles-link__image" src="/profiles/165979/thumb/Mark_Vanderwel.png" style="width: auto; height: 16px; border-radius: 50%;"><span class="sciprofiles-link__name">Mark Vanderwel</span></a></div> </div> </div> </li> <li class="accordion-direct-link"> <a href="/1999-4907/14/5/945/scifeed_display" data-reveal-id="scifeed-modal" data-reveal-ajax="true">Subscribe SciFeed</a> </li> <li class="accordion-direct-link js-article-similarity-container" style="display: none"> <a href="#" class="js-similarity-related-articles">Recommended Articles</a> </li> <li class="accordion-navigation"> <a href="#related" class="accordion__title">Related Info Link</a> <div id="related" class="content UI_ArticleMenu_RelatedLinks"> <ul> <li class="li-link"> <a href="https://scholar.google.com/scholar?q=Tree%20Species%20Classification%20in%20a%20Complex%20Brazilian%20Tropical%20Forest%20Using%20Hyperspectral%20and%20LiDAR%20Data" target="_blank" rel="noopener noreferrer">Google Scholar</a> </li> </ul> </div> </li> <li class="accordion-navigation"> <a href="#authors" class="accordion__title">More by Authors Links</a> <div id="authors" class="content UI_ArticleMenu_AuthorsLinks"> <ul class="side-menu-ul"> <li> <a class="expand" onclick='$(this).closest("li").next("div").toggle(); return false;'>on DOAJ</a> </li> <div id="AuthorDOAJExpand" style="display:none;"> <ul class="submenu"> <li class="li-link"> <a href='http://doaj.org/search/articles?source=%7B%22query%22%3A%7B%22query_string%22%3A%7B%22query%22%3A%22%5C%22Rorai%20Pereira%20Martins-Neto%5C%22%22%2C%22default_operator%22%3A%22AND%22%2C%22default_field%22%3A%22bibjson.author.name%22%7D%7D%7D' target="_blank" rel="noopener noreferrer">Pereira Martins-Neto, R.</a> <li> </li> <li class="li-link"> <a 
href='http://doaj.org/search/articles?source=%7B%22query%22%3A%7B%22query_string%22%3A%7B%22query%22%3A%22%5C%22Antonio%20Maria%20Garcia%20Tommaselli%5C%22%22%2C%22default_operator%22%3A%22AND%22%2C%22default_field%22%3A%22bibjson.author.name%22%7D%7D%7D' target="_blank" rel="noopener noreferrer">Garcia Tommaselli, A. Maria</a> <li> </li> <li class="li-link"> <a href='http://doaj.org/search/articles?source=%7B%22query%22%3A%7B%22query_string%22%3A%7B%22query%22%3A%22%5C%22Nilton%20Nobuhiro%20Imai%5C%22%22%2C%22default_operator%22%3A%22AND%22%2C%22default_field%22%3A%22bibjson.author.name%22%7D%7D%7D' target="_blank" rel="noopener noreferrer">Imai, N. Nobuhiro</a> <li> </li> <li class="li-link"> <a href='http://doaj.org/search/articles?source=%7B%22query%22%3A%7B%22query_string%22%3A%7B%22query%22%3A%22%5C%22Eija%20Honkavaara%5C%22%22%2C%22default_operator%22%3A%22AND%22%2C%22default_field%22%3A%22bibjson.author.name%22%7D%7D%7D' target="_blank" rel="noopener noreferrer">Honkavaara, E.</a> <li> </li> <li class="li-link"> <a href='http://doaj.org/search/articles?source=%7B%22query%22%3A%7B%22query_string%22%3A%7B%22query%22%3A%22%5C%22Milto%20Miltiadou%5C%22%22%2C%22default_operator%22%3A%22AND%22%2C%22default_field%22%3A%22bibjson.author.name%22%7D%7D%7D' target="_blank" rel="noopener noreferrer">Miltiadou, M.</a> <li> </li> <li class="li-link"> <a href='http://doaj.org/search/articles?source=%7B%22query%22%3A%7B%22query_string%22%3A%7B%22query%22%3A%22%5C%22Erika%20Akemi%20Saito%20Moriya%5C%22%22%2C%22default_operator%22%3A%22AND%22%2C%22default_field%22%3A%22bibjson.author.name%22%7D%7D%7D' target="_blank" rel="noopener noreferrer">Saito Moriya, E. Akemi</a> <li> </li> <li class="li-link"> <a href='http://doaj.org/search/articles?source=%7B%22query%22%3A%7B%22query_string%22%3A%7B%22query%22%3A%22%5C%22Hassan%20Camil%20David%5C%22%22%2C%22default_operator%22%3A%22AND%22%2C%22default_field%22%3A%22bibjson.author.name%22%7D%7D%7D' target="_blank" rel="noopener noreferrer">David, H. Camil</a> <li> </li> </ul> </div> <li> <a class="expand" onclick='$(this).closest("li").next("div").toggle(); return false;'>on Google Scholar</a> </li> <div id="AuthorGoogleExpand" style="display:none;"> <ul class="submenu"> <li class="li-link"> <a href="https://scholar.google.com/scholar?q=Rorai%20Pereira%20Martins-Neto" target="_blank" rel="noopener noreferrer">Pereira Martins-Neto, R.</a> <li> </li> <li class="li-link"> <a href="https://scholar.google.com/scholar?q=Antonio%20Maria%20Garcia%20Tommaselli" target="_blank" rel="noopener noreferrer">Garcia Tommaselli, A. Maria</a> <li> </li> <li class="li-link"> <a href="https://scholar.google.com/scholar?q=Nilton%20Nobuhiro%20Imai" target="_blank" rel="noopener noreferrer">Imai, N. Nobuhiro</a> <li> </li> <li class="li-link"> <a href="https://scholar.google.com/scholar?q=Eija%20Honkavaara" target="_blank" rel="noopener noreferrer">Honkavaara, E.</a> <li> </li> <li class="li-link"> <a href="https://scholar.google.com/scholar?q=Milto%20Miltiadou" target="_blank" rel="noopener noreferrer">Miltiadou, M.</a> <li> </li> <li class="li-link"> <a href="https://scholar.google.com/scholar?q=Erika%20Akemi%20Saito%20Moriya" target="_blank" rel="noopener noreferrer">Saito Moriya, E. Akemi</a> <li> </li> <li class="li-link"> <a href="https://scholar.google.com/scholar?q=Hassan%20Camil%20David" target="_blank" rel="noopener noreferrer">David, H. 
Camil</a> <li> </li> </ul> </div> <li> <a class="expand" onclick='$(this).closest("li").next("div").toggle(); return false;'>on PubMed</a> </li> <div id="AuthorPubMedExpand" style="display:none;"> <ul class="submenu"> <li class="li-link"> <a href="http://www.pubmed.gov/?cmd=Search&amp;term=Rorai%20Pereira%20Martins-Neto" target="_blank" rel="noopener noreferrer">Pereira Martins-Neto, R.</a> <li> </li> <li class="li-link"> <a href="http://www.pubmed.gov/?cmd=Search&amp;term=Antonio%20Maria%20Garcia%20Tommaselli" target="_blank" rel="noopener noreferrer">Garcia Tommaselli, A. Maria</a> <li> </li> <li class="li-link"> <a href="http://www.pubmed.gov/?cmd=Search&amp;term=Nilton%20Nobuhiro%20Imai" target="_blank" rel="noopener noreferrer">Imai, N. Nobuhiro</a> <li> </li> <li class="li-link"> <a href="http://www.pubmed.gov/?cmd=Search&amp;term=Eija%20Honkavaara" target="_blank" rel="noopener noreferrer">Honkavaara, E.</a> <li> </li> <li class="li-link"> <a href="http://www.pubmed.gov/?cmd=Search&amp;term=Milto%20Miltiadou" target="_blank" rel="noopener noreferrer">Miltiadou, M.</a> <li> </li> <li class="li-link"> <a href="http://www.pubmed.gov/?cmd=Search&amp;term=Erika%20Akemi%20Saito%20Moriya" target="_blank" rel="noopener noreferrer">Saito Moriya, E. Akemi</a> <li> </li> <li class="li-link"> <a href="http://www.pubmed.gov/?cmd=Search&amp;term=Hassan%20Camil%20David" target="_blank" rel="noopener noreferrer">David, H. Camil</a> <li> </li> </ul> </div> </ul> </div> </li> </ul> <span style="display:none" id="scifeed_hidden_flag"></span> <span style="display:none" id="scifeed_subscribe_url">/ajax/scifeed/subscribe</span> </div> </div> <div class="content__container responsive-moving-container large medium active hidden" data-id="article-counters"> <div id="counts-wrapper" class="row generic-item no-border" data-equalizer> <div id="js-counts-wrapper__views" class="small-12 hide columns count-div-container"> <a href="#metrics" > <div class="count-div" data-equalizer-watch> <span class="name">Article Views</span> <span class="count view-number"></span> </div> </a> </div> <div id="js-counts-wrapper__citations" class="small-12 columns hide count-div-container"> <a href="#metrics" > <div class="count-div" data-equalizer-watch> <span class="name">Citations</span> <span class="count citations-number Var_ArticleMaxCitations">-</span> </div> </a> </div> </div> </div> <div class="content__container"> <div class="hide-small-down-initially"> <ul class="accordion accordion__menu" data-accordion data-options="multi_expand:true;toggleable: true"> <li class="accordion-navigation"> <a href="#table_of_contents" class="accordion__title">Table of Contents</a> <div id="table_of_contents" class="content active"> <div class="menu-caption" id="html-quick-links-title"></div> </div> </li> </ul> </div> </div> <!-- PubGrade code --> <div id="pbgrd-sky"></div> <script src="https://cdn.pbgrd.com/core-mdpi.js"></script> <style>.content__container { min-width: 300px; }</style> <!-- PubGrade code --> </div> <div id="middle-column" class="content__column large-9 medium-9 small-12 columns end middle-bordered"> <div class="middle-column__help"> <div class="middle-column__help__fixed show-for-medium-up"> <span id="js-altmetrics-donut" href="#" target="_blank" rel="noopener noreferrer" style="display: none;"> <span data-badge-type='donut' class='altmetric-embed' data-doi='10.3390/f14050945'></span> <span>Altmetric</span> </span> <a href="#" class="UA_ShareButton" data-reveal-id="main-share-modal" title="Share"> <i 
class="material-icons">share</i> <span>Share</span> </a> <a href="#" data-reveal-id="main-help-modal" title="Help"> <i class="material-icons">announcement</i> <span>Help</span> </a> <a href="javascript:void(0);" data-reveal-id="cite-modal" data-counterslink = "https://www.mdpi.com/1999-4907/14/5/945/cite" > <i class="material-icons">format_quote</i> <span>Cite</span> </a> <a href="https://sciprofiles.com/discussion-groups/public/10.3390/f14050945?utm_source=mpdi.com&utm_medium=publication&utm_campaign=discuss_in_sciprofiles" target="_blank" rel="noopener noreferrer" title="Discuss in Sciprofiles"> <i class="material-icons">question_answer</i> <span>Discuss in SciProfiles</span> </a> </div> <div id="main-help-modal" class="reveal-modal reveal-modal-new" data-reveal aria-labelledby="modalTitle" aria-hidden="true" role="dialog"> <div class="row"> <div class="small-12 columns"> <h2 style="margin: 0;">Need Help?</h2> </div> <div class="small-6 columns"> <h3>Support</h3> <p> Find support for a specific problem in the support section of our website. </p> <a target="_blank" href="/about/contactform" class="button button--color button--full-width"> Get Support </a> </div> <div class="small-6 columns"> <h3>Feedback</h3> <p> Please let us know what you think of our products and services. </p> <a target="_blank" href="/feedback/send" class="button button--color button--full-width"> Give Feedback </a> </div> <div class="small-6 columns end"> <h3>Information</h3> <p> Visit our dedicated information section to learn more about MDPI. </p> <a target="_blank" href="/authors" class="button button--color button--full-width"> Get Information </a> </div> </div> <a class="close-reveal-modal" aria-label="Close"> <i class="material-icons">clear</i> </a> </div> </div> <div class="middle-column__main "> <div class="page-highlight"> <style type="text/css"> img.review-status { width: 30px; } </style> <div id="jmolModal" class="reveal-modal" data-reveal aria-labelledby="Captcha" aria-hidden="true" role="dialog"> <h2>JSmol Viewer</h2> <div class="row"> <div class="small-12 columns text-center"> <iframe style="width: 520px; height: 520px;" frameborder="0" id="jsmol-content"></iframe> <div class="content"></div> </div> </div> <a class="close-reveal-modal" aria-label="Close"> <i class="material-icons">clear</i> </a> </div> <div itemscope itemtype="http://schema.org/ScholarlyArticle" id="abstract" class="abstract_div"> <div class="js-check-update-container"></div> <div class="html-content__container content__container content__container__combined-for-large__first" style="overflow: auto; position: inherit;"> <div class='html-profile-nav'> <div class='top-bar'> <div class='nav-sidebar-btn show-for-large-up' data-status='opened' > <i class='material-icons'>first_page</i> </div> <a id="js-button-download" class="button button--color-inversed" style="display: none;" href="/1999-4907/14/5/945/pdf?version=1683252910" data-name="Tree Species Classification in a Complex Brazilian Tropical Forest Using Hyperspectral and LiDAR Data" data-journal="forests"> <i class="material-icons custom-download"></i> Download PDF </a> <div class='nav-btn'> <i class='material-icons'>settings</i> </div> <a href="/1999-4907/14/5/945/reprints" id="js-button-reprints" class="button button--color-inversed"> Order Article Reprints </a> </div> <div class='html-article-menu'> <div class='html-first-step row'> <div class='html-font-family large-6 medium-6 small-12 columns'> <div class='row'> <div class='html-font-label large-4 medium-4 small-12 columns'> Font 
Type: </div> <div class='large-8 medium-8 small-12 columns'> <span class="html-article-menu-option"><i style='font-family:Arial, Arial, Helvetica, sans-serif;' data-fontfamily='Arial, Arial, Helvetica, sans-serif'>Arial</i></span> <span class="html-article-menu-option"><i style='font-family:Georgia1, Georgia, serif;' data-fontfamily='Georgia1, Georgia, serif'>Georgia</i></span> <span class="html-article-menu-option"><i style='font-family:Verdana, Verdana, Geneva, sans-serif;' data-fontfamily='Verdana, Verdana, Geneva, sans-serif' >Verdana</i></span> </div> </div> </div> <div class='html-font-resize large-6 medium-6 small-12 columns'> <div class='row'> <div class='html-font-label large-4 medium-4 small-12 columns'>Font Size:</div> <div class='large-8 medium-8 small-12 columns'> <span class="html-article-menu-option a1" data-percent="100">Aa</span> <span class="html-article-menu-option a2" data-percent="120">Aa</span> <span class="html-article-menu-option a3" data-percent="160">Aa</span> </div> </div> </div> </div> <div class='row'> <div class='html-line-space large-6 medium-6 small-12 columns'> <div class='row'> <div class='html-font-label large-4 medium-4 small-12 columns' >Line Spacing:</div> <div class='large-8 medium-8 small-12 columns'> <span class="html-article-menu-option a1" data-line-height="1.5em"> <i class="fa">&#xf034;</i> </span> <span class="html-article-menu-option a2" data-line-height="1.8em"> <i class="fa">&#xf034;</i> </span> <span class="html-article-menu-option a3" data-line-height="2.1em"> <i class="fa">&#xf034;</i> </span> </div> </div> </div> <div class='html-column-width large-6 medium-6 small-12 columns'> <div class='row'> <div class='html-font-label large-4 medium-4 small-12 columns' >Column Width:</div> <div class='large-8 medium-8 small-12 columns'> <span class="html-article-menu-option a1" data-column-width="20%"> <i class="fa">&#xf035;</i> </span> <span class="html-article-menu-option a2" data-column-width="10%"> <i class="fa">&#xf035;</i> </span> <span class="html-article-menu-option a3" data-column-width="0%"> <i class="fa">&#xf035;</i> </span> </div> </div> </div> </div> <div class='row'> <div class='html-font-bg large-6 medium-6 small-12 columns end'> <div class='row'> <div class='html-font-label large-4 medium-4 small-12 columns'>Background:</div> <div class='large-8 medium-8 small-12 columns'> <div class="html-article-menu-option html-nav-bg html-nav-bright" data-bg="bright"> <i class="fa fa-file-text"></i> </div> <div class="html-article-menu-option html-nav-bg html-nav-dark" data-bg="dark"> <i class="fa fa-file-text-o"></i> </div> <div class="html-article-menu-option html-nav-bg html-nav-creme" data-bg="creme"> <i class="fa fa-file-text"></i> </div> </div> </div> </div> </div> </div> </div> <article ><div class='html-article-content'> <span itemprop="publisher" content="Multidisciplinary Digital Publishing Institute"></span><span itemprop="url" content="https://www.mdpi.com/1999-4907/14/5/945"></span> <div class="article-icons"><span class="label openaccess" data-dropdown="drop-article-label-openaccess" aria-expanded="false">Open Access</span><span class='label choice' data-dropdown='drop-article-label-choice' aria-expanded='false' data-editorschoiceaddition='<a href="/journal/forests/editors_choice">More Editor’s choice articles in journal <em>Forests</em>.</a>'>Editor’s Choice</span><span class="label articletype">Article</span></div> <h1 class="title hypothesis_container" itemprop="name"> Tree Species Classification in a Complex Brazilian Tropical 
Tree Species Classification in a Complex Brazilian Tropical Forest Using Hyperspectral and LiDAR Data

by Rorai Pereira Martins-Neto 1,2, Antonio Maria Garcia Tommaselli 2,*, Nilton Nobuhiro Imai 2, Eija Honkavaara 3, Milto Miltiadou 4, Erika Akemi Saito Moriya 2 and Hassan Camil David 5

1 Faculty of Forestry and Wood Sciences, Czech University of Life Sciences Prague (CULS), Kamýcká 129, 165-00 Prague, Czech Republic
2 Department of Cartography, São Paulo State University (FCT/UNESP), Roberto Simonsen 305, Presidente Prudente 19060-900, SP, Brazil
3 Department of Remote Sensing and Photogrammetry, Finnish Geospatial Research Institute (FGI), National Land Survey of Finland (NLS), Vuorimiehentie 5, FI-02150 Espoo, Finland
4 Department of Geography, University of Cambridge, Downing Site, 20 Downing Place, Cambridge CB2 3EL, UK
5 Brazilian Forest Service (SFB), SCEN Trecho 2, Sede do Ibama, Brasília 70818-900, DF, Brazil
* Author to whom correspondence should be addressed.

Forests 2023, 14(5), 945; https://doi.org/10.3390/f14050945

Submission received: 31 March 2023 / Revised: 28 April 2023 / Accepted: 29 April 2023 / Published: 4 May 2023

(This article belongs to the Special Issue Remote Sensing Approaches to Mapping and Monitoring Forest Vegetation Conditions)
UI_BrowseArticleFigures" data-target='article-popup' data-counterslink = "https://www.mdpi.com/1999-4907/14/5/945/browse" >Browse Figures</a> </div> <div id="article-popup" class="popupgallery" style="display: inline; line-height: 200%"> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-ag.png?1683253038" title=" <strong>Graphical abstract</strong><br/> "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g001.png?1683253012" title=" <strong>Figure 1</strong><br/> &lt;p&gt;Location of the &lt;span class=&quot;html-italic&quot;&gt;Ponte Branca&lt;/span&gt; Forest remnant with the field plots and the different successional stages found (Source: Martins-Neto et al., 2022 [&lt;a href=&quot;#B33-forests-14-00945&quot; class=&quot;html-bibr&quot;&gt;33&lt;/a&gt;]).&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g002.png?1683253029" title=" <strong>Figure 2</strong><br/> &lt;p&gt;Vertical stratification of &lt;span class=&quot;html-italic&quot;&gt;Ponte Branca&lt;/span&gt; Forest remnant. (&lt;b&gt;a&lt;/b&gt;) All tree heights. (&lt;b&gt;b&lt;/b&gt;) Lower stratum. (&lt;b&gt;c&lt;/b&gt;) Middle stratum. (&lt;b&gt;d&lt;/b&gt;) Upper stratum.&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g003.png?1683253022" title=" <strong>Figure 3</strong><br/> &lt;p&gt;Individual tree crowns delineated manually for each species identified in the field for hyperspectral orthomosaics in green (R: 780.49 nm; G: 650.96 nm; B: 535.09 nm), and for RGB imagens in red.&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g004.png?1683253010" title=" <strong>Figure 4</strong><br/> &lt;p&gt;(&lt;b&gt;a&lt;/b&gt;) Rikola hyperspectral camera. (&lt;b&gt;b&lt;/b&gt;) UAV quadcopter with Rikola camera mounted. (Source: Miyoshi, 2020 [&lt;a href=&quot;#B47-forests-14-00945&quot; class=&quot;html-bibr&quot;&gt;47&lt;/a&gt;]).&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g005.png?1683253016" title=" <strong>Figure 5</strong><br/> &lt;p&gt;Targets located near the overflown area. The radiometric targets are in red, and the GCPs are in blue.&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g006.png?1683253013" title=" <strong>Figure 6</strong><br/> &lt;p&gt;Hyperspectral images processing flowchart (Source: adapted from Näsi et al., 2015 and Moriya et al., 2017 [&lt;a href=&quot;#B51-forests-14-00945&quot; class=&quot;html-bibr&quot;&gt;51&lt;/a&gt;,&lt;a href=&quot;#B53-forests-14-00945&quot; class=&quot;html-bibr&quot;&gt;53&lt;/a&gt;]).&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g007.png?1683253019" title=" <strong>Figure 7</strong><br/> &lt;p&gt;LiDAR data processing flowchart. 
Dark blue are steps for PR data and light blue for FWF data.&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g008.png?1683253025" title=" <strong>Figure 8</strong><br/> &lt;p&gt;Canopy height model with the superpixels; on the left, it was generated with 100,000 superpixels, and on the right, it was generated with 200,000 superpixels.&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g009.png?1683253032" title=" <strong>Figure 9</strong><br/> &lt;p&gt;The selected superpixels are depicted in blue and were derived based on the criteria present in &lt;a href=&quot;#forests-14-00945-t004&quot; class=&quot;html-table&quot;&gt;Table 4&lt;/a&gt;. The merged superpixels are depicted with yellow9 while the white colour shows the comparison of the superpixels with the reference ITC. Regarding the case of SyRo, manual corrections was performed on the merged superpixels since an excessive number of cells were selected.&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g010.png?1683253018" title=" <strong>Figure 10</strong><br/> &lt;p&gt;Mean spectra for each tree species.&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g011.png?1683253033" title=" <strong>Figure 11</strong><br/> &lt;p&gt;Accuracy assessment of the 13 scenarios tested for the classification of tree species.&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g012.png?1683253006" title=" <strong>Figure 12</strong><br/> &lt;p&gt;Confusion matrix for the classification of the eight tree species for the two best scenarios.&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g013.png?1683253027" title=" <strong>Figure 13</strong><br/> &lt;p&gt;Feature importance of tree species classification for S11 and S13.&lt;/p&gt; "> </a> <a href="https://pub.mdpi-res.com/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g014.png?1683253008" title=" <strong>Figure 14</strong><br/> &lt;p&gt;Projection of features for the first and fourth principal components and their respective contribution.&lt;/p&gt; "> </a> </div> <a class="button button--color-inversed" href="/1999-4907/14/5/945/notes">Versions&nbsp;Notes</a> </div> </div> <div class="responsive-moving-container small hidden" data-id="article-counters" style="margin-top: 15px;"></div> <div class="html-dynamic"> <section> <div class="art-abstract art-abstract-new in-tab hypothesis_container"> <p> <div><section class="html-abstract" id="html-abstract"> <h2 id="html-abstract-title">Abstract</h2><b>:</b> <div class="html-p">This study experiments with different combinations of UAV hyperspectral data and LiDAR metrics for classifying eight tree species found in a Brazilian Atlantic Forest remnant, the most degraded Brazilian biome with high fragmentation but with huge structural complexity. The selection of the species was done based on the number of tree samples, which exist in the plot data and in the fact the UAV imagery does not acquire information below the forest canopy. Due to the complexity of the forest, only species that exist in the upper canopy of the remnant were included in the classification. 
A combination of hyperspectral UAV images and LiDAR point clouds was used in the experiment. The hyperspectral images were photogrammetrically and radiometrically processed to obtain orthomosaics with reflectance factor values. Raw spectra were extracted from the trees, and vegetation indices (VIs) were calculated. Regarding the LiDAR data, both the point cloud, referred to as Peak Returns (PR), and the full-waveform (FWF) LiDAR were included in this study. The point clouds were processed to normalize the intensities and heights, and different metrics were extracted for each data type (PR and FWF). Segmentation was performed semi-automatically using the superpixel algorithm, followed by manual correction to ensure precise tree crown delineation before tree species classification. Thirteen classification scenarios were tested, with spectral features and LiDAR metrics used separately or in combination. The best result was obtained with all features transformed with principal component analysis, reaching an accuracy of 76%, which did not differ significantly from the scenarios using the raw spectra or VIs with PR or FWF LiDAR metrics. The combination of spectral data with geometric information from LiDAR improved the classification of tree species in a complex tropical forest, and these results can serve to inform management and conservation practices for these forest remnants.

Keywords: Brazilian Atlantic Forest; tree species mapping; LiDAR; hyperspectral imaging; superpixel segmentation
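As a minimal sketch of the workflow summarized in the abstract, the snippet below concatenates per-crown spectral features and LiDAR metrics, applies a PCA transform, and trains a supervised classifier. The file and column names, the use of a random forest, and the cross-validation setup are illustrative assumptions, not the authors' code; the paper's 13 scenarios differ precisely in which feature groups are included and whether PCA is applied.

```python
# Sketch only: fuse per-crown spectral features and LiDAR metrics, transform with
# PCA, and classify. File layout and classifier choice are assumptions.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical per-crown feature tables indexed by crown ID, plus species labels.
spectral = pd.read_csv("crown_spectral_features.csv", index_col="crown_id")  # reflectances + VIs
lidar = pd.read_csv("crown_lidar_metrics.csv", index_col="crown_id")         # PR + FWF metrics
labels = pd.read_csv("crown_species.csv", index_col="crown_id")["species"]

X = spectral.join(lidar).loc[labels.index].to_numpy()
y = labels.to_numpy()

# Standardize, project onto principal components, then classify.
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=0.95),  # keep components explaining 95% of the variance
    RandomForestClassifier(n_estimators=500, random_state=42),
)
scores = cross_val_score(model, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```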
The discrimination of tree species is essential for forest ecology, as it supports the monitoring of biodiversity and invasive species, sustainable forest management and conservation practices, floristic and phytosociological forest inventories, and wildlife habitat mapping [<a href="#B4-forests-14-00945" class="html-bibr">4</a>,<a href="#B5-forests-14-00945" class="html-bibr">5</a>]. However, tree species classification in tropical forests has not been widely explored due to their complexity. Many layers are present within the forest; there is a large number of tree species, the tree heights and canopies are very heterogeneous, and the distribution of tree individuals significantly varies across the different strata of the forest. Thus, many tree species classification workflows were developed for temperate forests [<a href="#B4-forests-14-00945" class="html-bibr">4</a>,<a href="#B6-forests-14-00945" class="html-bibr">6</a>]. </div><div class='html-p'>Continuous advances in remote sensing technologies aim to tackle the problem of tree species identification and classification. In this paper, two important advancements in remote sensing that support species classification are investigated: hyperspectral imagery and LiDAR data. Hyperspectral sensors are narrowband sensors and can acquire nearly continuous reflectance spectra in many narrow bands for each pixel. These spectra can be used for detailed quantitative analyses and, consequently, can increase the separability of tree species, which absorb and reflect light differently along the spectrum. Classification in native tropical forests, though, is even more challenging when using only spectral data: a larger number of species must be spectrally separated, and individuals of the same species at different ages can also show different spectral reflectance, which complicates the separation between species and increases classification errors [<a href="#B7-forests-14-00945" class="html-bibr">7</a>,<a href="#B8-forests-14-00945" class="html-bibr">8</a>,<a href="#B9-forests-14-00945" class="html-bibr">9</a>]. Furthermore, hyperspectral sensors are not suitable for deriving structural parameters of the forest, such as tree height, canopy volume and density, or the number of strata in the forest, since only the reflectance of the tops of the objects is recorded [<a href="#B10-forests-14-00945" class="html-bibr">10</a>]. </div><div class='html-p'>LiDAR (Light Detection and Ranging) systems can provide three-dimensional information about the vegetation, thus allowing a better understanding of the geometry and the intensities of the vertical structure of forests [<a href="#B11-forests-14-00945" class="html-bibr">11</a>,<a href="#B12-forests-14-00945" class="html-bibr">12</a>]. This geometric/structural information of vegetation can provide important information to improve the separability between species in complex forests. </div><div class='html-p'>In this paper, we refer to LiDAR data as either peak returns (PR) or full-waveform (FWF). Traditionally, discrete LiDAR systems recorded multiple returns per emitted pulse [<a href="#B13-forests-14-00945" class="html-bibr">13</a>] when the return signal reaching the sensor was sufficiently intense and there was an offset between successive recorded returns.
The first and intermediate returns are suited to extracting information from partially penetrable objects, such as tree canopies and the structures present below them, while the last return is often used to obtain information from non-penetrable surfaces such as the terrain [<a href="#B13-forests-14-00945" class="html-bibr">13</a>,<a href="#B14-forests-14-00945" class="html-bibr">14</a>,<a href="#B15-forests-14-00945" class="html-bibr">15</a>]. Full-waveform LiDAR systems record and digitize the entire amount of energy returned to the sensor after being backscattered by objects present in the scanned area [<a href="#B16-forests-14-00945" class="html-bibr">16</a>,<a href="#B17-forests-14-00945" class="html-bibr">17</a>]. More information is recorded in full-waveform data than in discrete return systems. The waveform contains the properties of all elements intercepting the path of the emitted beam, and its analysis allows a better interpretation of the physical structure and geometric backscatter properties of the intercepted objects, which can improve the representation of the forest structure, including its vertical structure, canopy volume, understory, and terrain [<a href="#B13-forests-14-00945" class="html-bibr">13</a>,<a href="#B17-forests-14-00945" class="html-bibr">17</a>,<a href="#B18-forests-14-00945" class="html-bibr">18</a>,<a href="#B19-forests-14-00945" class="html-bibr">19</a>]. In this paper, we do not use the term “discrete LiDAR” since the system used to collect the LiDAR point cloud is a waveform sensor, and the point clouds are the return peaks exported from the waveform data either in real time by the system or in post-processing [<a href="#B20-forests-14-00945" class="html-bibr">20</a>]. </div><div class='html-p'>The fusion of features obtained from multisource remote sensing data, such as hyperspectral images with LiDAR metrics, has been used to combine complementary sources and obtain high-quality estimations [<a href="#B21-forests-14-00945" class="html-bibr">21</a>]. The complementary data provided by spectral information and LiDAR geometric/structural features can provide a more comprehensive interpretation for mapping tree species [<a href="#B22-forests-14-00945" class="html-bibr">22</a>].</div><div class='html-p'>Among the existing biomes in Brazil, the Atlantic Forest domain is the most degraded, with only approximately 11.6% of the original forest cover left, and the remaining areas are small and highly fragmented [<a href="#B23-forests-14-00945" class="html-bibr">23</a>,<a href="#B24-forests-14-00945" class="html-bibr">24</a>,<a href="#B25-forests-14-00945" class="html-bibr">25</a>]. The current forest remnants are insufficient to preserve the biodiversity; thus, many efforts have been made to restore these areas ecologically [<a href="#B26-forests-14-00945" class="html-bibr">26</a>,<a href="#B27-forests-14-00945" class="html-bibr">27</a>]. Due to their high ecological importance, studies are required for understanding the composition and spatial distribution of the species present in these small remnants [<a href="#B28-forests-14-00945" class="html-bibr">28</a>].
This could support the monitoring of changes in the forest canopy in response to deforestation and climate change, as well as proposals for the conservation and ecological restoration of tree species [<a href="#B8-forests-14-00945" class="html-bibr">8</a>,<a href="#B29-forests-14-00945" class="html-bibr">29</a>].</div><div class='html-p'>The aim of this study was to combine remote sensing data from different sources and determine which combination of data best classifies eight tree species that exist in the upper canopy of a remnant of the Brazilian Atlantic Forest. This is particularly important considering the high floristic diversity of tropical forests, which makes it difficult to separate different species, and the lack of knowledge about species composition and distribution. The spectral data from hyperspectral images obtained from a lightweight camera onboard a UAV (unmanned aerial vehicle) and the structural information from both peak return and full-waveform LiDAR data were investigated. Spectral and structural information were used, either separately or combined, in 13 unique scenarios for classifying the tree species, and these scenarios were evaluated. Before the classification processes, we tested a semi-automatic method for tree crown segmentation to demonstrate the challenge of delineating crowns in complex and heterogeneous forests. </div><div class='html-p'>Studies related to the classification of tropical forest species, mainly in Brazilian forest typologies other than the Amazon Forest, are scarce; thus, the methodology and results obtained in this study will serve as a guide for future studies involving the composition of species in Brazilian forests and other tropical forest remnants. In addition, the methodologies were all performed with well-established algorithms and open-source software, allowing any user to apply the methodology to their own dataset and obtain results that can help in conservation practices of tropical forests.</div></section><section id='sec2-forests-14-00945' type=''><h2 data-nested='1'> 2. Materials and Methods</h2><section id='sec2dot1-forests-14-00945' type=''><h4 class='html-italic' data-nested='2'> 2.1. Study Area and Inventory Data </h4><div class='html-p'>The study area is located in southeastern Brazil. It is a remnant of Atlantic Forest, protected by federal environmental laws, called <span class='html-italic'>Ponte Branca</span> (<a href="#forests-14-00945-f001" class="html-fig">Figure 1</a>). It has a high ecological importance because it is a transition zone between the Brazilian Savannah and one of the few remnants of semideciduous seasonal forest (inland Atlantic Forest) in the state of São Paulo. A detailed description of the vegetation and ecological succession that occurred in the <span class='html-italic'>Ponte Branca</span> Forest remnant can be found in [<a href="#B30-forests-14-00945" class="html-bibr">30</a>,<a href="#B31-forests-14-00945" class="html-bibr">31</a>,<a href="#B32-forests-14-00945" class="html-bibr">32</a>,<a href="#B33-forests-14-00945" class="html-bibr">33</a>].</div><div class='html-p'>The forest inventory was performed in 15 plots [<a href="#B33-forests-14-00945" class="html-bibr">33</a>], covering the different successional stages found in the <span class='html-italic'>Ponte Branca</span> Forest remnant.
All trees with DBH (diameter at breast height) greater than 3.5 cm were measured, counted, and identified to species by a specialist based on the APG IV system (Angiosperm Phylogeny Group) [<a href="#B34-forests-14-00945" class="html-bibr">34</a>].</div><div class='html-p'>In total, 3181 trees from 64 different species were measured. However, only some of these trees/species were selected as samples for automatic classification. Tropical forests have a heterogeneous structure with several layers, and many species are present in the understory and middle canopy of the forest, while few species and individuals reach the upper canopy. Thus, it is impracticable to classify species that are in the lower layers, below the crowns of the tallest trees, when classification relies mainly on passive sensors (e.g., images from UAVs (unmanned aerial vehicles)). In addition, smaller trees in the lower and middle strata of the forest often host lianas and vines, which can significantly modify their spectral response [<a href="#B35-forests-14-00945" class="html-bibr">35</a>].</div><div class='html-p'>The tree sample selection for classification was done in two steps. First, stratification was performed to locate trees that belonged to the upper canopy since UAV imagery does not acquire data from lower canopy structures. Second, only species with at least six trees in the forest inventory were retained.</div><div class='html-p'>Due to the complexity of tropical natural forests, tree height is difficult to obtain in the field. For the vertical stratification of the forest, we used the CHM (Canopy Height Model) obtained from the LiDAR survey. For more information on how the CHM was derived, please refer to <a href="#sec2dot3-forests-14-00945" class="html-sec">Section 2.3</a>. The vertical stratification of vegetation was based on [<a href="#B36-forests-14-00945" class="html-bibr">36</a>,<a href="#B37-forests-14-00945" class="html-bibr">37</a>], which divided the forest into three strata (lower, middle, and upper), based on the average height of the trees and the standard deviation of the heights. The lower stratum comprised trees with heights (Ht) less than the average height (Hm) minus one standard deviation (1σ); the upper stratum was defined as Ht ≥ (Hm + 1σ), and the middle stratum comprised trees with (Hm − 1σ) ≤ Ht &lt; (Hm + 1σ).</div><div class='html-p'>To avoid including ground points in forest stratification, we considered that only points above 1 m were vegetation. As a result, the lower stratum comprised trees below 4.8 m, the middle stratum included trees with heights between 4.9 m and 12.3 m, and the upper stratum comprised trees above 12.4 m (<a href="#forests-14-00945-f002" class="html-fig">Figure 2</a>). As most trees had a height between 6.4 m and 12.8 m, the middle stratum was more prominent. However, the trees present in the upper canopy had high ecological importance. These trees contributed the greatest amount to the forest biomass; they were important as seed carriers and dispersers, their fruits served as food for fauna, and some tree species had potential timber and non-timber products [<a href="#B29-forests-14-00945" class="html-bibr">29</a>,<a href="#B33-forests-14-00945" class="html-bibr">33</a>,<a href="#B38-forests-14-00945" class="html-bibr">38</a>,<a href="#B39-forests-14-00945" class="html-bibr">39</a>]. </div><div class='html-p'>Based on the two criteria for choosing tree samples, a total of 81 individuals of eight different species were selected.
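<div class='html-p'>To make the stratification rule above concrete, the following is a minimal sketch in R; the 1 m ground cutoff and the Hm ± 1σ thresholds follow the text, while the height vector and function name are illustrative:</div>
<pre><code class="language-r">
# Minimal sketch of the vertical stratification rule described above.
# 'tree_heights' is an illustrative vector of tree heights (m), for example
# taken from the LiDAR-derived CHM at the tree positions.
stratify_trees = function(tree_heights, ground_cutoff = 1) {
  h  = tree_heights[tree_heights > ground_cutoff]  # keep only vegetation (> 1 m)
  hm = mean(h)   # average height (Hm)
  s  = sd(h)     # one standard deviation (1 sigma)
  cut(h,
      breaks = c(-Inf, hm - s, hm + s, Inf),
      labels = c("lower", "middle", "upper"),
      right  = FALSE)  # lower: Ht below Hm - 1 sigma; upper: Ht at or above Hm + 1 sigma
}

# Example usage with simulated heights:
set.seed(1)
strata = stratify_trees(runif(500, min = 0.5, max = 25))
table(strata)
</code></pre>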
The position of the selected trees was determined by the distance and azimuth from the centre of the plot to the tree. Due to the closed canopy, obtaining the precise coordinates of the trees directly with a dual-frequency GNSS (global navigation satellite system) receiver and static relative positioning would have required collecting data for a long time, which would be unfeasible. RGB orthophotos with a GSD (ground sample distance) of 0.1 m were also used as a reference map to match the tree locations observed in the field with their positions on the map. </div><div class='html-p'>With the samples selected and located, the individual tree crowns (ITC) were manually delineated in the hyperspectral orthophotos of the Rikola camera (<a href="#sec2dot2-forests-14-00945" class="html-sec">Section 2.2</a>), with a GSD of 0.25 m and an infrared false-colour composition. The manual delineation was also performed in RGB orthophotos available for the study area with a GSD of 0.10 m (<a href="#forests-14-00945-f003" class="html-fig">Figure 3</a>). Both images were collected at the same time and without displacement. To ensure correct delineation, care was taken to avoid delineating structures that did not belong exclusively to the crown of the tree of interest, such as crowns of other trees and vines. Furthermore, for a correct delineation of tree crown boundaries, the normalized point cloud obtained with LiDAR data (<a href="#sec2dot3-forests-14-00945" class="html-sec">Section 2.3</a>) was used to provide a three-dimensional view of the tree structures, allowing more accurate manual delineation. This ITC delineation served as a ground reference for semi-automatic segmentation and classification. A summary description of the selected species, the number of samples for each tree species, as well as the sum and average number of pixels in the hyperspectral orthomosaics, is presented in <a href="#forests-14-00945-t001" class="html-table">Table 1</a>.</div></section><section id='sec2dot2-forests-14-00945' type=''><h4 class='html-italic' data-nested='2'> 2.2. Hyperspectral Imagery Data Acquisition and Processing</h4><div class='html-p'>The images were acquired by the Rikola hyperspectral frame camera, model DT-0011 (<a href="#forests-14-00945-f004" class="html-fig">Figure 4</a>), produced by Senop Ltd. [<a href="#B44-forests-14-00945" class="html-bibr">44</a>,<a href="#B45-forests-14-00945" class="html-bibr">45</a>,<a href="#B46-forests-14-00945" class="html-bibr">46</a>]. The camera was mounted onboard a UAV (Unmanned Aerial Vehicle) quadcopter. </div><div class='html-p'>The Rikola camera is based on the FPI (Fabry–Pérot Interferometer). It consists of two partially reflective parallel surfaces, separated by an air gap. This separation determines the wavelength transmitted by the interferometer, as the light rays that pass through the surfaces undergo multiple reflections according to the separation distance. Thus, changing the separation distance between the surfaces makes it possible to sensitize the camera sensor at different wavelengths [<a href="#B48-forests-14-00945" class="html-bibr">48</a>,<a href="#B49-forests-14-00945" class="html-bibr">49</a>,<a href="#B50-forests-14-00945" class="html-bibr">50</a>]. In addition, the camera has two CMOS sensors that operate simultaneously, with a pixel size of 5.5 μm, generating images of 1017 × 648 pixels.
The first sensor collects images at wavelengths between 647 and 900 nm and the second sensor between 500 and 635 nm, with a minimum spectral resolution of 10 nm at full width at half maximum (FWHM) [<a href="#B44-forests-14-00945" class="html-bibr">44</a>]. These features allow a flexible configuration of spectral bands [<a href="#B44-forests-14-00945" class="html-bibr">44</a>]. In addition, the Rikola camera has an auxiliary sensor that measures the irradiance, as well as a GNSS receiver that provides the camera’s latitude and longitude at the moment of image acquisition. </div><div class='html-p'>Regarding the settings used in this study, the camera was set to standalone mode, and the images were stored on a memory card. The number of acquired spectral bands was limited to 25 due to the transfer time of the images between the sensor and the memory card, the acquisition interval between two sequential images, and the exposure time of each image [<a href="#B44-forests-14-00945" class="html-bibr">44</a>,<a href="#B46-forests-14-00945" class="html-bibr">46</a>].</div><div class='html-p'>Due to the limited number of bands, the wavelengths that best characterise the tree species present in the <span class='html-italic'>Ponte Branca</span> Forest remnant were selected, as indicated by [<a href="#B44-forests-14-00945" class="html-bibr">44</a>]. These bands are shown in <a href="#forests-14-00945-t002" class="html-table">Table 2</a>. </div><div class='html-p'>The integration time was 10 ms, with an interval of 0.061 s between adjacent band exposures. Thus, each hyperspectral cube with 25 bands took 0.899 s to be acquired. Due to the misalignment between the two sensors of the Rikola camera and the UAV displacement during acquisition, the spectral bands of the hyperspectral cubes showed a slight difference in orientation and position. These misalignments were corrected with orthorectification of all hyperspectral image bands [<a href="#B44-forests-14-00945" class="html-bibr">44</a>,<a href="#B51-forests-14-00945" class="html-bibr">51</a>].</div><div class='html-p'>Four flight campaigns were needed to obtain images of the 15 surveyed plots in the <span class='html-italic'>Ponte Branca</span> Forest remnant. The flight campaigns were carried out in 2016 and 2017 (<a href="#forests-14-00945-t003" class="html-table">Table 3</a>) in the same season (winter) and under similar clear-day conditions, with few clouds and little wind, to avoid differences in solar angle and illumination. The same characteristics were also maintained in different years to avoid phenological differences between the various trees of the same species. On each flight, image blocks were acquired with a longitudinal overlap of at least 70% and lateral overlap of at least 50%.</div><div class='html-p'>On each flight campaign, signalized Ground Control Points (GCP) and radiometric targets were placed close to the study site to be used as reference for the bundle block adjustment and the radiometric calibration (<a href="#forests-14-00945-f005" class="html-fig">Figure 5</a>). GCP coordinates were determined with a dual-frequency GNSS receiver. The radiometric targets were produced with EVA (ethylene vinyl acetate) with approximate dimensions of 0.90 m × 0.90 m, in three colours: black, dark grey, and light grey.
On these targets, reflectance measurements were taken with a FieldSpec<sup>®</sup> Handheld spectroradiometer, manufactured by ASD [<a href="#B52-forests-14-00945" class="html-bibr">52</a>], to transform the images’ DN (digital numbers) into physical values of reflectance factor.</div><div class='html-p'>The images obtained by the Rikola hyperspectral camera required some processing to produce the orthomosaic and to correct the anisotropy factor and possible illumination variations during image acquisition [<a href="#B44-forests-14-00945" class="html-bibr">44</a>,<a href="#B48-forests-14-00945" class="html-bibr">48</a>,<a href="#B53-forests-14-00945" class="html-bibr">53</a>]. The processing flow of hyperspectral images is shown in <a href="#forests-14-00945-f006" class="html-fig">Figure 6</a>. This process was applied for each flight campaign.</div><div class='html-p'>The hyperspectral image processing was performed using the same methodology as described in [<a href="#B35-forests-14-00945" class="html-bibr">35</a>,<a href="#B44-forests-14-00945" class="html-bibr">44</a>,<a href="#B48-forests-14-00945" class="html-bibr">48</a>,<a href="#B51-forests-14-00945" class="html-bibr">51</a>,<a href="#B54-forests-14-00945" class="html-bibr">54</a>,<a href="#B55-forests-14-00945" class="html-bibr">55</a>]. First, the images were corrected for dark current using an image acquired with the camera lens obstructed by an opaque low-reflectance object, to remove the electronic noise from the camera [<a href="#B53-forests-14-00945" class="html-bibr">53</a>]. </div><div class='html-p'>The geometric processing was performed using the Agisoft PhotoScan software (Agisoft LLC, St. Petersburg, Russia). To optimize processing time, image orientations were estimated for four bands of the Rikola camera, two from each sensor (bands 1: 506.22 nm and 10: 628.75 nm from sensor two; bands 11: 650.96 nm and 25: 819.66 nm from sensor one) in a simultaneous bundle adjustment. The IOPs (interior orientation parameters) and EOPs (exterior orientation parameters) were estimated using self-calibrating BBA (Bundle Block Adjustment). The IOPs were estimated with individual sets for each sensor. The EOPs were estimated using the camera’s GNSS positions as initial values and were refined in the BBA with the GCPs. From the generated point cloud, the estimated parameters were optimized with the manual removal of outliers and the gradual selection of tie points to verify the projection error. After these procedures, calibrated IOPs and EOPs and a sparse point cloud were generated, followed by a dense point cloud, a DSM (digital surface model) with a GSD of 0.25 m, and a resampled DSM with a GSD of 5 m [<a href="#B47-forests-14-00945" class="html-bibr">47</a>,<a href="#B53-forests-14-00945" class="html-bibr">53</a>,<a href="#B56-forests-14-00945" class="html-bibr">56</a>]. </div><div class='html-p'>After this initial geometric processing, further photogrammetric techniques and radiometric processing were applied based on the methodology developed by [<a href="#B48-forests-14-00945" class="html-bibr">48</a>,<a href="#B55-forests-14-00945" class="html-bibr">55</a>,<a href="#B57-forests-14-00945" class="html-bibr">57</a>]. The EOPs for the remaining bands were computed with spatial resection using the sparse point cloud as a source of control.
However, due to the anisotropic characteristics of vegetation reflectance, quality of the sensor system, stability and atmospheric conditions, illumination changes caused by the clouds, and solar position, the same object did not present the same DN in different images. While the reflectance anisotropy is modelled by the bidirectional reflectance distribution function (BRDF), the relative differences in the overlapped images must be estimated in radiometric block adjustment [<a href="#B55-forests-14-00945" class="html-bibr">55</a>]. </div><div class='html-p'>The radiometric block adjustment assumes that the same object must provide a similar DN in all the images in which it appears. This method uses DN values of radiometric tie points, which are obtained from the resampled DSM in the overlapped images. This information determines the parameters of the radiometric model describing the differences between the DNs in the different images using the principle of weighted least squares. The DN values in the radiometric tie points were determined from a search window of predefined size (5 m × 5 m). Relative correction parameters were determined in relation to a reference image obtained in the nadir to correct the differences in illumination between the images, and a linear BRDF model was applied (Equation (1)). <div class='html-disp-formula-info' id='FD1-forests-14-00945'> <div class='f'> <math display='block'><semantics> <mrow> <mi>D</mi> <msub> <mi>N</mi> <mrow> <mi>j</mi> <mi>k</mi> </mrow> </msub> <mo>=</mo> <msub> <mi>a</mi> <mrow> <mi>r</mi> <mi>e</mi> <mi>l</mi> <mi>j</mi> </mrow> </msub> <mfenced> <mrow> <msub> <mi>a</mi> <mrow> <mi>a</mi> <mi>b</mi> <mi>s</mi> </mrow> </msub> <mo>·</mo> <msub> <mi>R</mi> <mrow> <mi>j</mi> <mi>k</mi> </mrow> </msub> <mfenced> <mrow> <msub> <mi>θ</mi> <mi>i</mi> </msub> <mo>,</mo> <msub> <mi>θ</mi> <mi>r</mi> </msub> <mo>,</mo> <mo> </mo> <mi>φ</mi> </mrow> </mfenced> <mo>+</mo> <msub> <mi>b</mi> <mrow> <mi>a</mi> <mi>b</mi> <mi>s</mi> </mrow> </msub> </mrow> </mfenced> </mrow> </semantics></math> </div> <div class='l'> <label >(1)</label> </div> </div> where <math display='inline'><semantics> <mrow> <mi>D</mi> <msub> <mi>N</mi> <mrow> <mi>j</mi> <mi>k</mi> </mrow> </msub> </mrow> </semantics></math> was the digital number of pixel <math display='inline'><semantics> <mi>k</mi> </semantics></math> in image <math display='inline'><semantics> <mi>j</mi> </semantics></math>; <math display='inline'><semantics> <mrow> <msub> <mi>R</mi> <mrow> <mi>j</mi> <mi>k</mi> </mrow> </msub> <mfenced> <mrow> <msub> <mi>θ</mi> <mi>i</mi> </msub> <mo>,</mo> <msub> <mi>θ</mi> <mi>r</mi> </msub> <mo>,</mo> <mo> </mo> <mi>φ</mi> </mrow> </mfenced> </mrow> </semantics></math> was the reflectance factor with respect to the zenith angle of incident light <math display='inline'><semantics> <mrow> <msub> <mi>θ</mi> <mi>i</mi> </msub> </mrow> </semantics></math> and reflected light <math display='inline'><semantics> <mrow> <msub> <mi>θ</mi> <mi>r</mi> </msub> </mrow> </semantics></math> and with respect to the relative azimuthal angle <math display='inline'><semantics> <mrow> <mi>φ</mi> <mfenced> <mrow> <msub> <mi>φ</mi> <mi>r</mi> </msub> <mo>−</mo> <msub> <mi>φ</mi> <mi>i</mi> </msub> </mrow> </mfenced> </mrow> </semantics></math> related to the incident <math display='inline'><semantics> <mrow> <msub> <mi>φ</mi> <mi>i</mi> </msub> </mrow> </semantics></math> and reflected <math display='inline'><semantics> <mrow> <msub> <mi>φ</mi> <mi>r</mi> </msub> </mrow> </semantics></math> 
azimuthal angle (for further details about this model, see [<a href="#B48-forests-14-00945" class="html-bibr">48</a>]); and <math display='inline'><semantics> <mrow> <msub> <mi>a</mi> <mrow> <mi>r</mi> <mi>e</mi> <mi>l</mi> <mi>j</mi> </mrow> </msub> </mrow> </semantics></math> was the relative correction factor of different illumination with respect to the reference image. Miyoshi et al. (2018) [<a href="#B44-forests-14-00945" class="html-bibr">44</a>], working in the same study area, defined the optimal value for <math display='inline'><semantics> <mrow> <msub> <mi>a</mi> <mrow> <mi>r</mi> <mi>e</mi> <mi>l</mi> <mi>j</mi> </mrow> </msub> </mrow> </semantics></math> as 1; <math display='inline'><semantics> <mrow> <msub> <mi>a</mi> <mrow> <mi>a</mi> <mi>b</mi> <mi>s</mi> </mrow> </msub> </mrow> </semantics></math> and <math display='inline'><semantics> <mrow> <msub> <mi>b</mi> <mrow> <mi>a</mi> <mi>b</mi> <mi>s</mi> </mrow> </msub> </mrow> </semantics></math> were the parameters defined by the empirical line [<a href="#B58-forests-14-00945" class="html-bibr">58</a>]. </div><div class='html-p'>In the radiometric block adjustment step, orthorectification of each image band was also performed. A DSM with the same GSD as the final orthomosaic (0.25 m) was used. Furthermore, the bands’ misalignments were also corrected in this process of orthorectification. At the end of this process, an orthomosaic for each dataset with 25 spectral bands, corrected for illumination and anisotropy variations, was produced. </div><div class='html-p'>The empirical line method was applied to the orthomosaics using the data obtained with the spectroradiometer on the radiometric reference targets of each flight campaign (<a href="#forests-14-00945-t003" class="html-table">Table 3</a>, <a href="#forests-14-00945-f005" class="html-fig">Figure 5</a>). The objective was to calculate the relation between the values obtained in the Rikola images and the reflectance of the targets in the field using a linear regression. The values of gain and offset were estimated, transforming the DN into physical values of reflectance factor [<a href="#B58-forests-14-00945" class="html-bibr">58</a>,<a href="#B59-forests-14-00945" class="html-bibr">59</a>]. </div><div class='html-p'>The spectroradiometer used to collect spectral information from the radiometric targets had a range between 325 nm and 1075 nm with a resolution of 1 nm. Thus, it was necessary to adjust the wavelength ranges to match the settings of the Rikola camera bands. The radiometric target spectra were simulated according to the spectral ranges of the camera bands, adopting a Gaussian curve for the spectral sensitivity [<a href="#B53-forests-14-00945" class="html-bibr">53</a>]. This simulation allowed evaluating the adherence of the spectral response of the targets obtained with the camera bands to the spectral response obtained with the spectroradiometer, which had a finer spectral resolution [<a href="#B56-forests-14-00945" class="html-bibr">56</a>]. With the physical values of reflectance factor, it was possible to characterize the targets spectrally (i.e., characterize different tree species), compare data from different sensors, and obtain vegetation indices [<a href="#B60-forests-14-00945" class="html-bibr">60</a>,<a href="#B61-forests-14-00945" class="html-bibr">61</a>]. For more information related to this process, refer to <a href="#sec2dot5dot1-forests-14-00945" class="html-sec">Section 2.5.1</a>.
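<div class='html-p'>As an illustration of the two steps just described, the sketch below first simulates the reflectance seen by one camera band from a spectroradiometer spectrum using a Gaussian spectral-sensitivity curve, and then fits the empirical line (gain and offset) between image DNs and the field reflectances of the three targets. The band centre and FWHM follow Table 2, but the spectrum, DN, and reflectance values are placeholders, not the values measured in the study:</div>
<pre><code class="language-r">
# Sketch of the Gaussian band simulation and the empirical line calibration
# described above. All numeric data values are placeholders.

# Spectroradiometer spectrum of one target: wavelength (nm) and reflectance factor.
wl   = 325:1075
refl = 0.05 + 0.0004 * (wl - 325) / 750            # illustrative, nearly flat spectrum

# Simulate the reflectance seen by one camera band (e.g., centre 628.75 nm,
# FWHM 10 nm) by weighting the spectrum with a Gaussian sensitivity curve.
simulate_band = function(wl, refl, centre, fwhm) {
  sigma = fwhm / (2 * sqrt(2 * log(2)))            # convert FWHM to sigma
  w = dnorm(wl, mean = centre, sd = sigma)         # Gaussian spectral sensitivity
  sum(w * refl) / sum(w)                           # weighted average reflectance
}
band_refl = simulate_band(wl, refl, centre = 628.75, fwhm = 10)

# Empirical line: linear regression between the mean DN of each target in the
# orthomosaic band and its simulated field reflectance (three targets here).
target_dn   = c(120, 480, 910)                     # black, dark grey, light grey (placeholders)
target_refl = c(0.04, 0.21, 0.47)                  # simulated band reflectances (placeholders)
fit    = lm(target_refl ~ target_dn)
gain   = coef(fit)["target_dn"]
offset = coef(fit)["(Intercept)"]

# Apply gain and offset to convert an image DN into a reflectance factor.
reflectance_factor = gain * 500 + offset
</code></pre>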
</div></section><section id='sec2dot3-forests-14-00945' type=''><h4 class='html-italic' data-nested='2'> 2.3. LiDAR Data Acquisition and Processing</h4><div class='html-p'>LiDAR data were collected with the RIEGL LMS-Q680i full-waveform sensor, which used the multiple time around (MTA) technique to operate with a high repetition frequency. This unit acquired pulses that arrived after a delay of more than one pulse repetition interval, allowing measurements with a range beyond the unambiguous maximum measuring range [<a href="#B20-forests-14-00945" class="html-bibr">20</a>]. Data were collected at a flight height of 900 m, and the waveforms were processed in post-processing mode; thus, the point clouds&mdash;traditionally called discrete LiDAR&mdash;were delivered as the peak returns (PR) of the waveforms and as a full-waveform (FWF) with a density of 19.8 points·m<sup>−2</sup> [<a href="#B62-forests-14-00945" class="html-bibr">62</a>].</div><div class='html-p'>Different processing was performed for each type of LiDAR data. For the PR LiDAR data, the objective was to normalize the point cloud intensities and heights to extract metrics related to height distribution, pulse returns, and intensity statistics, in addition to producing the digital models needed for the segmentation step (<a href="#sec2dot4-forests-14-00945" class="html-sec">Section 2.4</a>). For the FWF LiDAR data, the objective was the extraction of metrics related to the distribution of voxels that were or were not intercepted by a waveform sample, as well as to the signal intensity. Both sets of metrics (PR and FWF) were used as attributes for classifying tree species. The flowchart with the LiDAR point cloud processing steps is shown in <a href="#forests-14-00945-f007" class="html-fig">Figure 7</a>.</div><div class='html-p'>A detailed description of the PR LiDAR data processing for the same dataset is given in [<a href="#B62-forests-14-00945" class="html-bibr">62</a>]. The point cloud was classified into ground and vegetation points using <span class='html-italic'>LAStools</span> software [<a href="#B63-forests-14-00945" class="html-bibr">63</a>]. The ground points were rasterized using a TIN (triangular irregular network), producing a DTM (digital terrain model) with a GSD of 0.50 m. The next processing steps were performed in the R environment [<a href="#B64-forests-14-00945" class="html-bibr">64</a>] with the <span class='html-italic'>lidR</span> package [<a href="#B65-forests-14-00945" class="html-bibr">65</a>]. First, the intensity values of the point cloud were normalized based on the range method [<a href="#B66-forests-14-00945" class="html-bibr">66</a>,<a href="#B67-forests-14-00945" class="html-bibr">67</a>], since the distance between the laser sensor and the target is not constant throughout the LiDAR survey; in addition, there are other intensity distortions, caused, for example, by topography, equipment, and atmospheric effects, so that the raw intensities do not faithfully represent the target radiometry [<a href="#B68-forests-14-00945" class="html-bibr">68</a>]. </div><div class='html-p'>Once normalized, it was possible to use LiDAR metrics related to intensities as attributes for classification. The next step was to normalize the heights of the point cloud by subtracting the DTM, resulting in a point cloud with vegetation points mapped on flat terrain. Outliers were removed from the point cloud using the statistical outlier removal algorithm. For each point, the average distance to all <span class='html-italic'>k</span>-nearest neighbours was calculated.
If this distance is larger than a threshold, the point is considered noise. This threshold is empirically defined by the user as the average distance plus the standard deviation multiplied by a scale factor. We used the default values provided by the <span class='html-italic'>lidR</span> package [<a href="#B65-forests-14-00945" class="html-bibr">65</a>]. Then, the normalized point cloud was rasterized using the point-to-raster algorithm (<span class='html-italic'>p2r</span>) [<a href="#B69-forests-14-00945" class="html-bibr">69</a>] to produce the CHM (canopy height model), which contained the tree height value for each pixel with a GSD of 0.50 m. The CHM was necessary for vegetation stratification and tree crown segmentation. In addition, the point cloud with normalized intensities and heights was used to extract PR LiDAR metrics that describe the vegetation structure, as explained in <a href="#sec2dot5dot2-forests-14-00945" class="html-sec">Section 2.5.2</a>.</div><div class='html-p'>The process of intensity normalization of the FWF point clouds was the same as for the PR point cloud. The other processing steps were performed in the open-source software <span class='html-italic'>DASOS</span> (forest in Greek) developed by [<a href="#B70-forests-14-00945" class="html-bibr">70</a>,<a href="#B71-forests-14-00945" class="html-bibr">71</a>]. The FWF data (i.e., the waveform samples) were voxelized by <span class='html-italic'>DASOS</span>, which creates a 3D discrete density volume, similar to a 3D grayscale image, by accumulating the intensities of multiple pulses. First, 3D space was divided into voxels (i.e., 3D pixels), and the waveform samples were inserted into this voxelized space. The intensity values of each voxel were averaged so that the size of the pulse width associated with each voxel was consistent [<a href="#B72-forests-14-00945" class="html-bibr">72</a>]. Even though an intensity value is preserved in the voxels, the majority of the metrics exported by <span class='html-italic'>DASOS</span> focus on structural elements and consider whether the voxels are empty or not (e.g., the distribution of non-empty voxels). </div><div class='html-p'>Denoising is a necessary step when using FWF data, as the sensor records low-amplitude signals that are not real vegetation returns. The <span class='html-italic'>DASOS</span> software performs low-level filtering. A threshold is selected by the user, and waveform samples with amplitudes below the selected threshold are eliminated. For our data, some tests with different thresholds were performed. The best result was obtained using the average of the wave samples plus one standard deviation, which implied that all wave samples with amplitude less than 20 were eliminated. Voxel size was also selected based on preliminary tests. Large voxel sizes can aggregate information from several trees within the same voxel, which makes the separability analysis for tree species difficult. Small voxel sizes greatly increase processing time, and it can be difficult to find patterns across species due to the high level of detail. The chosen voxel size was 1 m × 1 m × 1 m, with the subsequent extraction of 2D FWF metrics from the voxelized 3D data. In this strategy, each pixel contains information related to the column of voxels intercepted by the waveform samples.
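<div class='html-p'>Recapping the PR point-cloud steps of this subsection, the sketch below is a minimal lidR-based version of the pipeline (DTM from a TIN, range-based intensity normalization, height normalization, SOR denoising, and a 0.50 m CHM with p2r). Ground classification is done here with lidR&apos;s csf() as a stand-in for the LAStools step, the range is only roughly approximated from the reported flight height, and parameter values and file names are illustrative rather than those used in the study:</div>
<pre><code class="language-r">
# Minimal lidR sketch of the PR LiDAR processing described in Section 2.3.
# csf() replaces the LAStools ground classification used in the study;
# numeric parameters and file names are illustrative.
library(lidR)

las = readLAS("plot_pointcloud.laz")

# Ground classification and 0.50 m DTM from a TIN of the ground points.
las = classify_ground(las, algorithm = csf())
dtm = rasterize_terrain(las, res = 0.5, algorithm = tin())

# Simple range-based intensity normalization (a common form of the range
# method): scale intensities by the squared ratio of range to a reference range.
# The range is approximated here from the 900 m flight height and point elevation.
flight_height  = 900
range_approx   = flight_height - las@data$Z
ref_range      = mean(range_approx)
norm_intensity = las@data$Intensity * (range_approx / ref_range)^2  # kept for intensity metrics

# Height normalization (heights relative to the DTM) and SOR denoising.
las = normalize_height(las, dtm)
las = classify_noise(las, algorithm = sor(k = 10, m = 3))
las = filter_poi(las, Classification != LASNOISE)

# 0.50 m canopy height model with the point-to-raster (p2r) algorithm.
chm = rasterize_canopy(las, res = 0.5, algorithm = p2r())
</code></pre>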
The extracted metrics are explained in <a href="#sec2dot5dot2-forests-14-00945" class="html-sec">Section 2.5.2</a>, and these metrics show promising potential for tree species classification and biophysical properties estimation at the tree level [<a href="#B72-forests-14-00945" class="html-bibr">72</a>]. During the voxelization process, the DTM produced with the PR LiDAR data was used to normalize the heights of the voxel columns.</div></section><section id='sec2dot4-forests-14-00945' type=''><h4 class='html-italic' data-nested='2'> 2.4. Tree Crowns Segmentation </h4><div class='html-p'>Segmentation or ITC delineation is a crucial step when classifying tree species at the individual tree level. This is because it increases the accuracy of the classification and enables the production of maps that depict the distribution of various tree species [<a href="#B8-forests-14-00945" class="html-bibr">8</a>,<a href="#B29-forests-14-00945" class="html-bibr">29</a>,<a href="#B73-forests-14-00945" class="html-bibr">73</a>,<a href="#B74-forests-14-00945" class="html-bibr">74</a>]. Trees in tropical forests have a wide range of heights and heterogeneous crown shapes and usually overlap with neighbouring individuals. This makes ITC delineation a challenging task in itself [<a href="#B29-forests-14-00945" class="html-bibr">29</a>,<a href="#B75-forests-14-00945" class="html-bibr">75</a>].</div><div class='html-p'>Among the methods available for segmentation, we used the superpixel method with an adaptation of the SLIC (simple linear iterative clustering) algorithm [<a href="#B76-forests-14-00945" class="html-bibr">76</a>]. Within Brazilian tropical forests, the SLIC method has been successfully used for the classification of different successional stages and their evolution [<a href="#B31-forests-14-00945" class="html-bibr">31</a>], as well as for the classification of emergent tree species [<a href="#B47-forests-14-00945" class="html-bibr">47</a>]. The adaptation of the SLIC algorithm was developed by [<a href="#B77-forests-14-00945" class="html-bibr">77</a>] and is available in the supercells package for the R environment [<a href="#B64-forests-14-00945" class="html-bibr">64</a>]. Although the SLIC method requires converting the data into a false-colour RGB image, the adapted approach is more flexible regarding the data structure, as it eliminates the need for such conversion. This allows the use of a more specific distance measure and of a custom function for the mean values of the cluster centres. The adapted SLIC method starts with regularly located cluster centres spaced by an interval S. Then the distance between a cluster centre and each cell in its 2S × 2S region is calculated. Superpixels are created by assigning each cell to the centre of the cluster with the smallest overall distance. While SLIC uses its standard distance measure, the adapted method allows any distance measure to be used to calculate the distance from the data. It also allows any function (not just the arithmetic mean) to be used to compute the values of the cluster centres. Then, the centres of the clusters are updated to the values given by the adopted averaging function over all the cells belonging to their respective clusters.
The algorithm works iteratively, repeating the above process until reaching the expected number of iterations [<a href="#B77-forests-14-00945" class="html-bibr">77</a>].</div><div class='html-p'>The ITC semi-automatic delineation was composed of the following steps:</div><div class='html-p'><dl class='html-roman-lower'><dt id=''>(i)</dt><dd><div class='html-p'>For the segmentation using superpixels, the following parameters were defined by the user: the number of superpixels to be generated (k); the compactness, which defines the shape of the superpixels, with higher values resulting in more regular superpixels (square grid), while lower values create more spatially adapted but irregularly shaped superpixels; and the distance measure (dist_fun) to be adopted. We used the CHM raster as input data to create the superpixels. Two k values were tested (100,000 and 200,000), with compactness equal to 1 and a Euclidean dist_fun (<a href="#forests-14-00945-f008" class="html-fig">Figure 8</a>).</div></dd><dt id=''>(ii)</dt><dd><div class='html-p'>Due to the small displacement between the LiDAR data and the hyperspectral orthomosaics, they needed to be co-registered. The images were registered based on the LiDAR point cloud, which had better geometry, using homologous points between the two data sources. The co-registration was done using the Nearest Neighbor method to preserve the original value of the image pixels, and a first-order affine transformation was applied. The error achieved in the co-registration process was less than one pixel for all hyperspectral orthomosaics.</div></dd><dt id=''>(iii)</dt><dd><div class='html-p'>The value of 200,000 superpixels generated the best results for the segmentation, preventing the inclusion of more than one tree crown in just one segment (undersegmentation). Thus, oversegmentation (i.e., several segments representing one tree crown) occurred, and it was necessary to merge these segments. We used a method combining automatic merging and manual editing. The automatic method for merging the superpixels was an adaptation of the methodology described in [<a href="#B47-forests-14-00945" class="html-bibr">47</a>]. We considered the maximum height of each segment, the standard deviation of the heights, and the Jeffreys–Matusita spectral distance (JM) [<a href="#B78-forests-14-00945" class="html-bibr">78</a>] for each tree species. Three height classes were created based on the heights of the selected trees, and standard deviation thresholds were selected for each of the classes (<a href="#forests-14-00945-t004" class="html-table">Table 4</a>). Due to the different number of pixels in each ITC, we used the same number of pixels to extract the reflectance factor of each ITC. The number of pixels was based on the smallest delineated ITC (252 pixels) [<a href="#B79-forests-14-00945" class="html-bibr">79</a>], and the average of the 10 brightest pixels was extracted. Then, the JM distance was calculated for each pairwise combination of tree species for the 25 hyperspectral bands of the orthomosaic from the Rikola camera.
If a given superpixel was contained in a height class with a standard deviation smaller than the threshold and the JM was less than the minimum difference for species separability, the superpixel was merged (<a href="#forests-14-00945-t004" class="html-table">Table 4</a>).</div></dd><dt id=''>(iv)</dt><dd><div class='html-p'>Finally, due to the low number of samples, superpixels that did not accurately correspond to the tree crown were edited manually, ensuring that 100% of the trees were correctly delineated (<a href="#forests-14-00945-f009" class="html-fig">Figure 9</a>).</div></dd></dl></div><div class='html-p'>From the delineated segments, it was possible to extract attributes from the three datasets analysed: raw spectra and vegetation indices from the hyperspectral images, metrics from the PR LiDAR data and their reduction using principal component analysis, and metrics extracted with <span class='html-italic'>DASOS</span> from the FWF LiDAR data with some additional processing. These steps are detailed in the following section.</div></section><section id='sec2dot5-forests-14-00945' type=''><h4 class='html-italic' data-nested='2'> 2.5. Feature Extraction </h4><div class='html-p'>The features were extracted using the segments generated by the superpixels. Due to the varying segment sizes, the segment with the smallest number of pixels was used as a parameter to extract the features in each dataset.</div><section id='sec2dot5dot1-forests-14-00945' type=''><h4 class='' data-nested='3'> 2.5.1. Hyperspectral Images Features</h4><div class='html-p'>Many hyperspectral features were included in the classifier. Some of them were directly derived from the band reflectances and others by combining multiple bands. The features extracted from the hyperspectral orthomosaics were the spectral signature of each tree, which is also referred to as the reflectance factor of each tree. We used the average of the 10 brightest pixels from each ITC. Due to the different ITC sizes, we used the segment with the smallest number of pixels as a parameter, and we called the resulting feature the raw spectrum. The raw spectrum was used as a feature for classification. A spectral average curve was calculated for each tree species as well.</div><div class='html-p'>The extracted spectra were visually analysed, and those wavelengths that best differentiated the species were verified. Then, vegetation indices (VI) derived from these wavelengths were calculated. These VIs relate to structure, leaf pigments, and plant physiology (<a href="#forests-14-00945-t005" class="html-table">Table 5</a>) and were included in the classification. All VIs were adapted to the spectral bands of the Rikola hyperspectral camera. The spectral bands closest to the wavelengths of a specific VI were adopted as the selection criterion.</div><div class='html-p'>Thus, from the hyperspectral orthomosaics, two sets of attributes were extracted and used for the classification of tree species: the reflectance factor of each tree in the 25 bands from the orthomosaics of the Rikola camera and 11 vegetation indices. </div></section><section id='sec2dot5dot2-forests-14-00945' type=''><h4 class='' data-nested='3'> 2.5.2. LiDAR Features</h4><div class='html-p'>After the processing performed on the point clouds (<a href="#sec2dot3-forests-14-00945" class="html-sec">Section 2.3</a>, <a href="#forests-14-00945-f007" class="html-fig">Figure 7</a>), 53 PR LiDAR metrics were extracted for each tree from 1 m above the ground upwards to avoid ground points.
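<div class='html-p'>A minimal lidR sketch of this per-tree metric extraction is given below; the file names are placeholders, and .stdmetrics yields a standard set of height, intensity, and return statistics rather than the exact 53 metrics used here:</div>
<pre><code class="language-r">
# Sketch of per-ITC PR LiDAR metric extraction with lidR (illustrative only;
# the study used 53 metrics, described in the reference cited below).
library(lidR)
library(sf)

las_norm = readLAS("plot_normalized.laz")    # height- and intensity-normalized cloud (placeholder)
itcs     = st_read("itc_polygons.gpkg")      # manually corrected ITC polygons (placeholder)

# Keep only vegetation points above 1 m to avoid ground points.
las_veg = filter_poi(las_norm, Z > 1)

# Standard height/intensity/return metrics computed per crown polygon.
metrics_list = lapply(seq_len(nrow(itcs)), function(i) {
  crown = clip_roi(las_veg, itcs[i, ])
  cloud_metrics(crown, func = .stdmetrics)
})
pr_metrics = do.call(rbind, lapply(metrics_list, as.data.frame))
</code></pre>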
A description of the metrics extracted with the <span class='html-italic'>lidR</span> package [<a href="#B65-forests-14-00945" class="html-bibr">65</a>] is available in [<a href="#B62-forests-14-00945" class="html-bibr">62</a>]; they are related to statistics of the distributions of heights, intensities, and pulse returns (e.g., measures of central tendency, cumulative percentages, and percentiles).</div><div class='html-p'>As the PR LiDAR metrics were highly correlated, it was necessary to apply an attribute selection method. We used PCA (Principal Components Analysis), available in the <span class='html-italic'>FactoMineR</span> package [<a href="#B93-forests-14-00945" class="html-bibr">93</a>] for R, to transform the metrics into a new set of uncorrelated orthogonal metrics [<a href="#B94-forests-14-00945" class="html-bibr">94</a>]. Based on the Kaiser criterion [<a href="#B95-forests-14-00945" class="html-bibr">95</a>], the first five PCs (principal components) explained 76.5% of the data variability.</div><div class='html-p'>Thus, from the PR LiDAR metrics, two sets of attributes were extracted for tree species classification: the 53 metrics and the transformation of the same set into five PCs.</div><div class='html-p'>From the FWF LiDAR data, 2D metrics (i.e., in raster format, produced by the software <span class='html-italic'>DASOS</span>) were extracted from the information contained in each voxel column, whether intercepted by a waveform sample or not. The set of extracted metrics can be seen in <a href="#forests-14-00945-t006" class="html-table">Table 6</a>.</div><div class='html-p'>After extracting the metrics, a “salt and pepper” effect was observed in the images, which was caused by the absence of a pulse passing through the corresponding column of the voxelized space. This effect was removed with a 3 × 3 median filter. Subsequently, using the number of pixels of the smallest ITC, the average of each of the metrics for each ITC was extracted, totalling nine features for the FWF LiDAR data.</div></section></section><section id='sec2dot6-forests-14-00945' type=''><h4 class='html-italic' data-nested='2'> 2.6. Automatic Classification and Performance Assessment</h4><div class='html-p'>The sets of features extracted from the different datasets (hyperspectral images and PR and FWF LiDAR point clouds) were used either independently or combined as training and testing data for the tree species classifier targeting tropical forests. We investigated whether or not the classification accuracy improved when spectral and structural attributes were combined. Thirteen different scenarios were tested (<a href="#forests-14-00945-t007" class="html-table">Table 7</a>). We also merged all the extracted features into a single classification feature vector containing the combination of all metrics (spectra, VIs, PR LiDAR, FWF LiDAR), as well as their transformation by PCA. The data were transformed in the same way and with the same criteria as explained in <a href="#sec2dot5dot2-forests-14-00945" class="html-sec">Section 2.5.2</a>. The first 10 components explained 84.8% of the data variability. For that reason, the 10 PCs were used as a new set of uncorrelated, orthogonally transformed metrics.</div><div class='html-p'>The algorithm used for the classification of tree species was RF (random forest), which consists of several decision trees, with the class of a given sample being determined by the most frequent vote among the decision trees [<a href="#B96-forests-14-00945" class="html-bibr">96</a>].
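<div class='html-p'>A condensed sketch of the RF classification and the leave-one-out validation described in this subsection is given below. The feature table and label vector are placeholders for the per-ITC features and species labels of one scenario; the parameter choices follow the text (ntree = 1000, mtry equal to the square root of the number of features):</div>
<pre><code class="language-r">
# Sketch of the random forest classification with leave-one-out
# cross-validation (LOOCV). 'features' and 'species' are placeholders for
# the per-ITC feature table (data frame) and species labels (factor).
library(randomForest)

set.seed(42)
n_trees   = 1000
n_samples = nrow(features)
mtry_val  = floor(sqrt(ncol(features)))     # square root of the number of features

pred = factor(rep(NA, n_samples), levels = levels(species))
for (i in seq_len(n_samples)) {
  # Train on all samples except the i-th one, then predict the held-out sample.
  rf = randomForest(x = features[-i, ], y = species[-i],
                    ntree = n_trees, mtry = mtry_val, importance = TRUE)
  pred[i] = predict(rf, features[i, , drop = FALSE])
}

# Overall accuracy and confusion matrix from the LOOCV predictions.
conf_matrix = table(reference = species, predicted = pred)
overall_acc = sum(diag(conf_matrix)) / n_samples
</code></pre>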
The algorithm randomly creates new training sets by resampling the original data with replacement as many times as the number of samples [<a href="#B97-forests-14-00945" class="html-bibr">97</a>]. The classification was performed in the R environment [<a href="#B64-forests-14-00945" class="html-bibr">64</a>] using the <span class='html-italic'>randomForest</span> package [<a href="#B98-forests-14-00945" class="html-bibr">98</a>]. The following parameters were selected: the number of trees built (<span class='html-italic'>ntree</span> = 1000) and the number of candidate variables in each tree node (<span class='html-italic'>mtry</span>), defined as the square root of the number of input features in each of the tested scenarios (<a href="#forests-14-00945-t007" class="html-table">Table 7</a>). With RF, it was possible to obtain the degree of importance of each one of the features used as input for the classification of tree species. Moreover, this algorithm handles high-dimensional data well in classification problems [<a href="#B99-forests-14-00945" class="html-bibr">99</a>,<a href="#B100-forests-14-00945" class="html-bibr">100</a>]. </div><div class='html-p'>Of the 81 tree samples, 60% were used for training and 40% for validating the classification. Due to the low number of samples, the LOOCV (leave-one-out cross-validation) method was used for validation. According to [<a href="#B47-forests-14-00945" class="html-bibr">47</a>,<a href="#B79-forests-14-00945" class="html-bibr">79</a>,<a href="#B101-forests-14-00945" class="html-bibr">101</a>], the LOOCV technique has proven successful when working with fewer than ten samples per class or with an unbalanced number of samples per class. </div><div class='html-p'>The classification evaluation was performed with the results obtained in the LOOCV process. Then, the following statistics were calculated for each tested scenario: the confidence interval for overall accuracy (OA) at 95% probability and Cohen’s Kappa index (κ) [<a href="#B102-forests-14-00945" class="html-bibr">102</a>]. For the best scenario, the confusion matrix was generated with producer accuracy (PA) and user accuracy (UA). The relative importance of the features that best separated the tree species was also analyzed. A map with the distribution of tree species classified by the best scenario was also produced.</div></section></section><section id='sec3-forests-14-00945' type='results'><h2 data-nested='1'> 3. Results</h2><div class='html-p'>From the hyperspectral orthomosaics, it was possible to extract the reflectance factor spectrum of each tree and identify the wavelengths that are more suitable for the separability of tree species, as well as the possible confusion between some species due to spectral similarity. The average spectra for each tree species are shown in <a href="#forests-14-00945-f010" class="html-fig">Figure 10</a>. The species AnPe and AsPo presented spectra with different shapes when compared to the other species. While AnPe has a low reflectance factor in the visible wavelengths (506.22 nm to 700.28 nm), AsPo has a high reflectance factor not only in the visible range, but also in the near infrared wavelengths (700.28 nm to 819.66 nm). The other tree species showed similar behaviour along the spectrum, with subtle differences between them in the visible wavelengths and with the differences gradually increasing in the near infrared wavelengths.
However, the tree species AnPe, ApLe, and HyCo showed small differences in their spectral responses, mainly between 720.17 nm and 819.66 nm, which made it difficult to differentiate and classify these species using solely the spectral information.</div><div class='html-p'>In <a href="#app1-forests-14-00945" class="html-app">Figure S1 of the Supplementary Material</a>, the spectra of all samples of trees of each species are presented. Some inferences can be made about, for example, the tree’s level of development, crown transparency, leaf distribution, leaf senescence, and whether the tree is under biotic or abiotic stress conditions. Explaining the reasons for the changes in the spectral response of the species requires further studies. The physiological condition of the trees at the time of the acquisition of the images and during different seasons should be observed and compared, according to [<a href="#B35-forests-14-00945" class="html-bibr">35</a>]. However, this kind of study is very challenging in tropical forests, and it is recommended for future work. </div><div class='html-p'>From the 13 classification scenarios tested (<a href="#forests-14-00945-t007" class="html-table">Table 7</a>), the S3 to S5 scenarios, using only LiDAR metrics for classification (PR LiDAR, PR LiDAR PCA, and FWF LiDAR), resulted in the lowest classification accuracy, with an average OA between 33% and 36% and a Kappa index between 0.22 and 0.24. For the studied tropical forest, the LiDAR metrics, containing only forest structural information, were not effective enough for the classification of tree species. </div><div class='html-p'>Regarding the spectral information, an average OA of 55% (S1) was achieved. Even though the classifier, using solely the raw spectra, differentiated the tree species better than when using only the LiDAR metrics, it was still not very effective in classifying the tropical tree species. It is further worth noting that no significant difference was observed in the classification results when using the raw spectra and when using the VIs (S2) as input data.</div><div class='html-p'>The scenarios that combined spectral data with LiDAR metrics showed improved classification accuracy, with an OA above 64%, except for S7 and S10, which had accuracies close to those obtained using spectral information only (i.e., 55% and 58%, respectively). Both scenarios used PR LiDAR metrics transformed by PCA; in other words, even when combined with spectral information and VIs, the PCA transformation of the PR LiDAR metrics (derived from the <span class='html-italic'>lidR</span> package [<a href="#B65-forests-14-00945" class="html-bibr">65</a>]) was not as effective in differentiating and classifying tree species. </div><div class='html-p'>However, when all the included features (raw spectra, VIs, and PR and FWF LiDAR metrics) were combined and transformed by PCA (S13), the best classification results were achieved, with an average OA of 76% and a kappa index of 0.71. This could be explained by the fact that the extracted PR and FWF metrics were of different types and could complement each other. Nevertheless, the results of S11, which contains only the FWF features (extracted with <span class='html-italic'>DASOS</span>) and the VI metrics, are very close to those of S13, which contains all the metrics; this could indicate that fewer but cleaner metrics—reduced dimensionality—can confer great results.
<a href="#forests-14-00945-f011" class="html-fig">Figure 11</a> summarizes the overall accuracy confidence intervals and kappa indices for all tested scenarios. It can be verified that the confidence intervals for scenarios S6, S11, and S12 do not differ from that of the best scenario, S13.</div><div class='html-p'>The confusion matrices for the two best scenarios (S11 and S13) are depicted in <a href="#forests-14-00945-f012" class="html-fig">Figure 12</a>. The AnPe and AsPo tree species had a UA of 100% in both scenarios. The ApLe and HeAp species reached 100% UA only in S13, and the CoLa species only in S11. In Scenario 11, no PtPu tree was correctly classified; this species was confused with the HeAp and SyRo species. Most likely, the crowns of these trees were very close to each other, causing confusion mainly in the spectral response. </div><div class='html-p'>The species HyCo was not correctly classified in S13. There was confusion with the CoLa species, which belongs to the same botanical family (Fabaceae—Caesalpinioideae, <a href="#forests-14-00945-t001" class="html-table">Table 1</a>). Thus, depending on the developmental stage and phenology, the trees of these two species may present a similar spectral response and structure. There was also confusion between the HyCo and SyRo species. SyRo is prominent in the <span class='html-italic'>Ponte Branca</span> Forest remnant and, as a palm, has a star-shaped crown (<a href="#forests-14-00945-f003" class="html-fig">Figure 3</a>) that can intertwine with the crowns of other trees and interfere with the spectral response and structure of a given tree.</div><div class='html-p'>The feature importance for the best classification scenarios (S11 and S13) obtained by the RF classifier, in terms of MDA (mean decrease in accuracy), is shown in <a href="#forests-14-00945-f013" class="html-fig">Figure 13</a>. The reflectance factor at the red edge position, obtained from the hyperspectral orthomosaics, was the variable that contributed most to the separability of the tree species in S11, followed by the VIs MCARI and ND_682_553 (<a href="#forests-14-00945-t007" class="html-table">Table 7</a>). For S13, the fourth and first PCs were the variables that contributed most to the classification of the tree species. The contribution of each spectral feature, VI, and LiDAR metric to the PCs is shown in <a href="#forests-14-00945-f014" class="html-fig">Figure 14</a>. </div><div class='html-p'>Analysing the feature importance, the raw spectra and VIs were not as effective when used independently for classifying tree species (<a href="#forests-14-00945-f011" class="html-fig">Figure 11</a>). However, when combined with FWF LiDAR metrics, their potential for classifying tree species in tropical forests increased. Regarding the PCs (<a href="#forests-14-00945-f014" class="html-fig">Figure 14</a>), we can see that the raw spectra have a high contribution to the first component and the VIs have a high contribution to the fourth component. Together with these features, the FWF LiDAR metrics, such as the lowest return, average height difference, thickness, non-empty voxels, and maximum intensity, proved to be very effective in the classification for both scenarios. The PR LiDAR metrics showed a similar degree of contribution to each PC.
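</div><div class='html-p'>To make the evaluation workflow concrete, the following minimal sketch (illustrative only, not the code used in this study) shows, in R, how an RF classifier configured as described above (<span class='html-italic'>ntree</span> = 1000 and <span class='html-italic'>mtry</span> equal to the square root of the number of features) can be assessed with LOOCV, yielding the confusion matrix, the OA with its 95% confidence interval, Cohen’s kappa, the PA and UA per species, and the MDA feature importance. The data frame <span class='html-italic'>feat</span>, with one row per crown and a factor column <span class='html-italic'>species</span>, is an assumed placeholder for the per-crown feature table.</div><div class='html-p'><pre><code class="language-r">
## Illustrative sketch (assumed data frame `feat`: one row per crown, numeric
## feature columns, and a factor column `species` with the field labels).
library(randomForest)

set.seed(42)
p    = ncol(feat) - 1               # number of input features
mtry = floor(sqrt(p))               # candidate variables at each node

## Leave-one-out cross-validation: each crown is predicted by a forest
## trained on all the remaining crowns.
pred = character(nrow(feat))
for (i in seq_len(nrow(feat))) {
  fit     = randomForest(species ~ ., data = feat[-i, ], ntree = 1000, mtry = mtry)
  pred[i] = as.character(predict(fit, feat[i, ]))
}
pred = factor(pred, levels = levels(feat$species))

## Confusion matrix, overall accuracy with its 95% CI, and Cohen's kappa.
cm  = table(reference = feat$species, predicted = pred)
oa  = sum(diag(cm)) / sum(cm)
ci  = binom.test(sum(diag(cm)), sum(cm))$conf.int   # exact 95% CI for OA
pe  = sum(rowSums(cm) * colSums(cm)) / sum(cm)^2    # chance agreement
kap = (oa - pe) / (1 - pe)

## Producer's (PA) and user's (UA) accuracy per species.
pa = diag(cm) / rowSums(cm)
ua = diag(cm) / colSums(cm)

## Mean decrease in accuracy (MDA) from a forest trained on all samples.
full = randomForest(species ~ ., data = feat, ntree = 1000, mtry = mtry,
                    importance = TRUE)
mda  = importance(full, type = 1)
mda[order(mda, decreasing = TRUE), , drop = FALSE]
</code></pre>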
</div><div class='html-p'>Regarding visualization, <a href="#app1-forests-14-00945" class="html-app">Figure S2 of the Supplementary Materials</a> shows the distribution of the tree species classified by the best scenario. As the superpixel segmentation was performed semi-automatically, meaning that the generated segments were corrected to ensure precise delineation, the produced distribution maps are reliable. Segments of trees that were not classified correctly are outlined with the colour of the correct tree species.</div></section><section id='sec4-forests-14-00945' type='discussion'><h2 data-nested='1'> 4. Discussion</h2><section id='sec4dot1-forests-14-00945' type=''><h4 class='html-italic' data-nested='2'> 4.1. ITC Delineation</h4><div class='html-p'>The delineation of individual tree crowns was performed on the CHM using the superpixel method. Since the main focus of the study was to investigate which combination of hyperspectral and LiDAR metrics best classifies tree species, and the superpixel method leads to over-segmentation, the over-segmentation was corrected in a semi-automated way. At first, small segments created by the superpixel algorithm were merged according to predefined criteria (<a href="#forests-14-00945-t004" class="html-table">Table 4</a>). After this automatic step, the merged superpixels were checked, and the quality of the segmentation was improved using vector editing tools. This method ensured that all trees were correctly delineated and that no samples were left out. However, this approach was feasible because of the small number of sampled trees in this study; more robust methods for delineating tree crowns in tropical forests are needed.</div><div class='html-p'>It was shown that the superpixel approach was superior to the watershed algorithm for delineating tree crowns from the CHM at the <span class='html-italic'>Ponte Branca</span> Forest remnant [<a href="#B47-forests-14-00945" class="html-bibr">47</a>], where a segmentation accuracy of 62% was achieved. However, in [<a href="#B47-forests-14-00945" class="html-bibr">47</a>], the presence of the SyRo palm, whose crown is not circular, was reported to make the automatic process challenging. Distinguishing the palm crowns (star-shaped, <a href="#forests-14-00945-f009" class="html-fig">Figure 9</a>) required smaller superpixels, but this caused over-segmentation of the other species within the tropical forest, which have wider and more circular crowns. The same problem was found in our study, and the segments referring to the SyRo palm species had to be manually corrected in most cases.</div><div class='html-p'>In the Brazilian Amazon forest, ref. [<a href="#B103-forests-14-00945" class="html-bibr">103</a>] tested several ITC delineation algorithms available in the <span class='html-italic'>lidR</span> package for R [<a href="#B65-forests-14-00945" class="html-bibr">65</a>] using the LiDAR-derived CHM. The best result, with an accuracy of 65%, was obtained by the method developed by [<a href="#B104-forests-14-00945" class="html-bibr">104</a>], which is based on seeds and Voronoi tessellation. The authors mentioned that raster CHM-based methods are ineffective at detecting trees present in the lower strata.</div><div class='html-p'>In another inland Atlantic Forest remnant in Brazil, ref. [<a href="#B105-forests-14-00945" class="html-bibr">105</a>] tested a new automatic method for delineating ITCs using high-resolution multispectral satellite images.
The method encompasses several steps, namely pre-processing, selection of forest pixels, enhancement and detection of pixels at the crown borders, correction of shade in large trees, and segmentation of the tree crowns. The accuracy of the method was 79%, showing it to be effective for large tree crowns; however, it is ineffective at detecting trees in the understory and trees located in areas shadowed by other trees or by the terrain.</div><div class='html-p'>All the authors cited above mention the difficulty of delineating ITCs in tropical forests due to the complexity and heterogeneity of the forest formations and the difficulty of segmenting species in the lower strata, mainly because smaller trees lie below the crowns of larger trees. According to [<a href="#B75-forests-14-00945" class="html-bibr">75</a>], a perfect ITC delineation in tropical forests is unrealistic. However, partial information that allows the delineation of dominant, rare, or invasive tree species that could be important ecological indicators is of great value for better understanding these complex ecosystems.</div></section><section id='sec4dot2-forests-14-00945' type=''><h4 class='html-italic' data-nested='2'> 4.2. Tree Species Classification</h4><div class='html-p'>In this study, we classified eight tree species in a Brazilian Atlantic Forest remnant using multisource remote sensing data. Three different datasets were used: hyperspectral images, PR LiDAR, and FWF LiDAR data. These data were used independently or in combination to train and evaluate an RF tree species classifier. Many studies have addressed the classification of tree species in temperate and subtropical forests using spectral and/or geometric data (i.e., LiDAR) [<a href="#B4-forests-14-00945" class="html-bibr">4</a>,<a href="#B6-forests-14-00945" class="html-bibr">6</a>,<a href="#B106-forests-14-00945" class="html-bibr">106</a>,<a href="#B107-forests-14-00945" class="html-bibr">107</a>,<a href="#B108-forests-14-00945" class="html-bibr">108</a>,<a href="#B109-forests-14-00945" class="html-bibr">109</a>,<a href="#B110-forests-14-00945" class="html-bibr">110</a>,<a href="#B111-forests-14-00945" class="html-bibr">111</a>,<a href="#B112-forests-14-00945" class="html-bibr">112</a>,<a href="#B113-forests-14-00945" class="html-bibr">113</a>,<a href="#B114-forests-14-00945" class="html-bibr">114</a>], but few studies have been carried out to classify tree species in Brazilian tropical forests, mainly due to the difficulty of accessing these areas, the difficulty of obtaining a sufficient number of samples of each species, and the great heterogeneity and diversity of tree species in these forests. Thus, our discussion is based, whenever possible, on studies with similar applications in tropical and subtropical forests.</div><div class='html-p'>To classify eight emergent tree species in the <span class='html-italic'>Ponte Branca</span> Forest remnant (the same study area), ref. [<a href="#B35-forests-14-00945" class="html-bibr">35</a>] used hyperspectral data from the Rikola camera onboard a UAV, collected on three different dates, to understand whether multitemporal data can improve the classification. The variables used were the normalized and non-normalized tree species spectra. The use of temporal spectral information improved the classification performance for three of the eight analysed species.
However, for the other species, differences in environmental conditions between the years influenced the flowering and defoliation of the species even in the same season, thus altering the spectral response, as did the time of image acquisition. The best result was obtained with the normalized spectra (OA of 50%). In our study, the OA was similar (55%) using the raw spectra (<a href="#forests-14-00945-f011" class="html-fig">Figure 11</a>). The aforementioned authors reported a difference in the spatial position of the ITCs over the years, and some neighbouring trees interfered with the spectral response of the tree species to be classified. In our case, there was no misalignment between samples of the same ITC, but for the same species, there were samples from two different dates (2016 and 2017). Even though the data in the different years were collected in the same season, there was a lag of one year and one month (<a href="#forests-14-00945-t003" class="html-table">Table 3</a>). Thus, the same species can present different physiological and phenological behaviour in different years, which may explain why the raw spectra and VIs were not effective in differentiating the tree species in the <span class='html-italic'>Ponte Branca</span> Forest remnant. The most important features found by [<a href="#B35-forests-14-00945" class="html-bibr">35</a>] were the wavelengths mainly from 628.73 nm to 780.49 nm (Band 10 to Band 23), which coincide with the bands that contribute most to the PCs (<a href="#forests-14-00945-f014" class="html-fig">Figure 14</a>) for the best classification scenario in our study. This makes sense, as the band configuration of the camera used in both studies was the same and the species chosen for classification were similar as well.</div><div class='html-p'>In another inland Atlantic Forest remnant in Brazil, [<a href="#B29-forests-14-00945" class="html-bibr">29</a>] classified eight forest species using airborne hyperspectral images obtained with the AisaEAGLE sensor in the VNIR spectrum (visible and near-infrared) and the AisaHAWK sensor in the SWIR spectrum (shortwave infrared). Various combinations of wavelengths and VIs were tested. Species discrimination performed best using visible bands (mainly wavelengths located at 550 nm and 650 nm) and SWIR bands. Vegetation indices contributed positively to the classification when integrated with VNIR features and should be used if the sensor does not acquire data in the SWIR wavelengths. The PSRI vegetation index (see <a href="#forests-14-00945-t005" class="html-table">Table 5</a>) was one of the most important for tree species differentiation and, in our case, had the fourth highest relative importance (see <a href="#forests-14-00945-f013" class="html-fig">Figure 13</a>) for the second-best classification scenario (S11). The best classification accuracy obtained by [<a href="#B29-forests-14-00945" class="html-bibr">29</a>] was 90.1%. This result was better than ours, even for our best classification scenario (S13, OA of 76%). One of the reasons is sample sufficiency. The forest remnant in the study of [<a href="#B29-forests-14-00945" class="html-bibr">29</a>] has a larger area and a better conservation state; consequently, more trees are present in the upper canopy. A total of 273 samples of eight species were selected there, while in our study, only 81 samples of eight species were available. In addition, the Rikola camera does not collect data in the SWIR spectrum.
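</div><div class='html-p'>As an illustration of how such narrowband indices are derived from the crown spectra, the sketch below computes a generic normalized difference index (of the form used for ND_682_553) and MCARI, two of the most important VIs in S11 (<a href="#forests-14-00945-f013" class="html-fig">Figure 13</a>). The formulations follow their commonly used definitions, and the vector <span class='html-italic'>refl</span> is an assumed placeholder for a crown’s mean reflectance factor spectrum; the exact band centres and index formulas adopted in this study are those listed in <a href="#forests-14-00945-t005" class="html-table">Table 5</a>.</div><div class='html-p'><pre><code class="language-r">
## Illustrative sketch: narrowband VIs from a crown's mean reflectance factor
## spectrum. `refl` is an assumed named numeric vector, e.g. refl["682"] holds
## the reflectance factor of the band centred near 682 nm.
nd = function(refl, b1, b2) {       # generic normalized difference index
  (refl[as.character(b1)] - refl[as.character(b2)]) /
  (refl[as.character(b1)] + refl[as.character(b2)])
}

ND_682_553 = nd(refl, 682, 553)     # red/green normalized difference

## MCARI in its common formulation (bands near 700, 670, and 550 nm).
MCARI = ((refl["700"] - refl["670"]) - 0.2 * (refl["700"] - refl["550"])) *
        (refl["700"] / refl["670"])
</code></pre>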
</div><div class='html-p'>It was possible to verify that data from narrowband (i.e., hyperspectral) sensors are an important tool for the discrimination of tropical species, since specific wavelengths and vegetation indices can be obtained in the parts of the spectrum where the species can be differentiated according to their spectral response. Even though the results achieved using only spectral information were not satisfactory, wavelengths in the VIS, red-edge, and NIR regions proved suitable for calculating most of the VIs and for classifying the different tree species up to a certain level in our study. </div><div class='html-p'>With PR LiDAR data, it is possible to extract information that describes the structure and geometry of forests, and this information also has the potential to discriminate tree species; however, its isolated use was not effective for classification in our study. The PR LiDAR metrics and their transformation by PCA showed the lowest classification accuracies (OA of 33% and 36%, respectively). Michalowska and Rapiński (2021) [<a href="#B6-forests-14-00945" class="html-bibr">6</a>] commented that using only vertical structural features from PR LiDAR (i.e., height distribution) can decrease classification accuracy. Vegetation structure alone is not species-specific but is conditioned more by the position in the ecological succession (e.g., whether the species is a pioneer, secondary, or climax species) or by the layer occupied in the forest (e.g., understory or lower, medium, or upper stratum). Furthermore, tropical forests have multiple layers with smaller trees below the canopy, and therefore, PR LiDAR height distributions and pulse returns are ineffective for species separation in the lower strata [<a href="#B6-forests-14-00945" class="html-bibr">6</a>,<a href="#B47-forests-14-00945" class="html-bibr">47</a>,<a href="#B113-forests-14-00945" class="html-bibr">113</a>] when used independently as input for tree species classification. In more complex forests, the spectral differences are usually more pronounced than the structural differences when each is used independently [<a href="#B110-forests-14-00945" class="html-bibr">110</a>], which was confirmed in our study when analysing each of these features (hyperspectral images, PR LiDAR, and FWF LiDAR) separately.</div><div class='html-p'>While PR LiDAR metrics can decrease classification accuracy, the isolated use of FWF LiDAR metrics has great potential for the classification of tree species, as the analysis of the complete waveform allows a better interpretation of the physical structure and geometric backscatter properties of the intercepted objects [<a href="#B13-forests-14-00945" class="html-bibr">13</a>,<a href="#B17-forests-14-00945" class="html-bibr">17</a>,<a href="#B115-forests-14-00945" class="html-bibr">115</a>,<a href="#B116-forests-14-00945" class="html-bibr">116</a>]. Some authors, such as [<a href="#B19-forests-14-00945" class="html-bibr">19</a>,<a href="#B117-forests-14-00945" class="html-bibr">117</a>], used metrics related to the number of waveform peaks, waveform distance, height of median energy, roughness, and return of waveform energy for tree species classification. Hollaus et al. (2009) [<a href="#B118-forests-14-00945" class="html-bibr">118</a>] used FWF LiDAR metrics related to echo height distributions, the mean and standard deviation of echo widths, mean intensities, and backscatter cross-sections.
In China’s subtropical forests, an OA of 68.6% was obtained for the classification of six tree species [<a href="#B117-forests-14-00945" class="html-bibr">117</a>]. Reitberger et al. (2008) [<a href="#B19-forests-14-00945" class="html-bibr">19</a>] found an OA of 79% for tree species classification under leaf-on conditions and an OA of 95% under leaf-off conditions in the Bavarian Forest National Park in Germany, and in the pre-Alps region of Austria, three species (beech, spruce, and larch) were classified with an OA of 85% by [<a href="#B118-forests-14-00945" class="html-bibr">118</a>]. All these authors used metrics extracted from FWF LiDAR data. However, in our study, using only FWF LiDAR metrics to classify tree species was unsuccessful (OA of 36%). It is noteworthy that none of the cited studies were performed in complex tropical forests; in addition, the <span class='html-italic'>DASOS</span> software provided a different set of FWF metrics, related to the spatial distribution of the voxels that do or do not contain a waveform sample. It is worth noting that these voxel-distribution metrics could also be extracted from point clouds, as each waveform sample is actually a point associated with an intensity. </div><div class='html-p'>Although the spectral data from the Rikola camera and the geometric/structural data from LiDAR were not effective for classification when used in isolation (S1 to S5), LiDAR geometric data, especially when combined with radiometric (intensity) and spectral data, provided valuable information for the classification of tree species in complex forests [<a href="#B4-forests-14-00945" class="html-bibr">4</a>,<a href="#B6-forests-14-00945" class="html-bibr">6</a>]. </div><div class='html-p'>The voxel-related metrics extracted from the FWF LiDAR data using <span class='html-italic'>DASOS</span> combined with the VIs from the Rikola camera formed one of the best combinations for the classification of the eight species of the <span class='html-italic'>Ponte Branca</span> Forest remnant, with an OA of 73%, an improvement of 18% over the classification performed with the spectra and VIs extracted from the Rikola camera and a substantial improvement over the OA of 36% obtained using the FWF LiDAR metrics alone. To classify two forest species (spruce and Douglas fir) in different age classes, Buddenbaum et al. (2013) [<a href="#B10-forests-14-00945" class="html-bibr">10</a>] used 122 spectral bands of the HyMap hyperspectral sensor and the normalized intensities of the waveforms intercepting voxel columns with dimensions of 0.5 m. Combining these two data sources, the OA was 72.2%, improving the classification accuracy by 16% compared to using the spectral bands only and by 5.5% compared to using the spectral bands with percentile heights from the PR LiDAR data. The FWF LiDAR metric related to the intensity of the voxels intercepted by the waveform samples proved to be an important metric for the differentiation of forest species, including in our study, in which it was important for both S11 and S13 (<a href="#forests-14-00945-f013" class="html-fig">Figure 13</a>). Liao et al. (2018) [<a href="#B22-forests-14-00945" class="html-bibr">22</a>] also confirmed an improvement in the accuracy of the classification of seven forest species in the western part of Belgium when using different height percentiles of FWF LiDAR together with hyperspectral images.
The improvement was 7.7% compared to the classification using only the hyperspectral bands and 24.8% compared to using only the rasterized FWF LiDAR data. It is worth mentioning that metrics related to height percentiles depend on the point cloud density, which makes comparisons with other study areas or with point clouds acquired under different parameters difficult. Thus, benchmarking data for testing algorithms across different acquisitions and study areas are necessary to understand how tree species classification performs using different types of LiDAR metrics [<a href="#B119-forests-14-00945" class="html-bibr">119</a>]. The <span class='html-italic'>DASOS</span> software normalizes the data during voxelization and deals with the irregular scan pattern, and the extracted metrics are not dependent on the point density [<a href="#B72-forests-14-00945" class="html-bibr">72</a>].</div><div class='html-p'>Although the FWF LiDAR metrics performed better when combined with the VIs in classifying tree species, the difference was small when we examined the confidence intervals (<a href="#forests-14-00945-f011" class="html-fig">Figure 11</a>) for the scenarios that combine either PR LiDAR or FWF LiDAR metrics with the spectral information and VIs (S8, S9, and S11 to S13). S13 includes all the metrics used in S11, and only a small increase in classification accuracy was observed when the extra metrics extracted from the PR data and the spectral data were included, except for the scenarios whose combinations used the PR LiDAR metrics transformed by PCA (S7 and S10). </div><div class='html-p'>Because FWF LiDAR data require more computer memory than PR LiDAR data, their processing is more time-consuming and computationally demanding, and few tools are available for processing and extracting information from FWF LiDAR point clouds [<a href="#B72-forests-14-00945" class="html-bibr">72</a>,<a href="#B120-forests-14-00945" class="html-bibr">120</a>]. In addition, due to the advancement of LiDAR systems, it is possible to obtain several returns for each emitted pulse. Thus, PR LiDAR metrics can be effective for tree species classification when FWF LiDAR data are not available and/or cannot be processed.</div><div class='html-p'>There are many tools and workflows for processing PR LiDAR point clouds, and several authors have already shown that the combined use of LiDAR metrics and spectral information from hyperspectral sensors (e.g., raw spectra and VIs) improves the classification of tree species in different types of forests. For the classification of invasive tree species in Hawaiian tropical forests, Asner et al. (2008) [<a href="#B121-forests-14-00945" class="html-bibr">121</a>] found an improvement in accuracy from 63% to 91% when using the tree species spectra together with LiDAR heights. Shen and Cao (2017) [<a href="#B122-forests-14-00945" class="html-bibr">122</a>] found an improvement of approximately 6% in the classification of five species, and ref. [<a href="#B110-forests-14-00945" class="html-bibr">110</a>] found an increase of approximately 7% in the classification accuracy of 18 tree species when using both VIs and LiDAR metrics in Chinese subtropical forests. In our study area, the improvements were 15% when combining the raw spectra with PR LiDAR metrics and 9% when combining the VIs with PR LiDAR metrics for classifying the eight tree species.
This corroborates the potential of combining hyperspectral data with geometric LiDAR data, mainly height percentiles and the percentage of returns, to improve the accuracy of tree species classification, including in tropical forests. </div><div class='html-p'>Using all the features extracted from the different data sources for classification (i.e., the spectra and VIs from the Rikola camera images and the PR and FWF LiDAR metrics), we achieved a classification OA of 70% (Scenario 12), and when transforming all the features by PCA, the OA was 76% (Scenario 13), the best result obtained in this study. Using a large set of features as input data for classification, ref. [<a href="#B110-forests-14-00945" class="html-bibr">110</a>] combined VIs and PCA components from hyperspectral images, textural information from RGB images, and structural metrics from LiDAR, totalling 64 features, for the classification of tree species in Chinese subtropical forests. The OA was 91.8%, a better result than using the features in isolation or combining them two by two (e.g., LiDAR and VIs). For the classification of 12 tree species in a Brazilian subtropical forest, ref. [<a href="#B79-forests-14-00945" class="html-bibr">79</a>] tested several scenarios with different inputs, one of which contained 68 features (e.g., raw spectra and VIs obtained from hyperspectral images, photogrammetric point cloud metrics, CHM, and textural information). The OA was 70.7%, differing by less than 2% from the best scenario, which used the raw spectra, VIs, and structural information from the photogrammetric point cloud as input.</div><div class='html-p'>When many features are used as input data for classification with the RF algorithm, a random subset of the features is selected at each node of each tree [<a href="#B123-forests-14-00945" class="html-bibr">123</a>]. If a feature that does not contribute to species differentiation is selected, the classification performance may decline [<a href="#B123-forests-14-00945" class="html-bibr">123</a>]. The more non-contributing features are added to the RF algorithm, the greater the probability of these features being selected at each node, increasing the generalization error of the algorithm in addition to generating very large trees [<a href="#B123-forests-14-00945" class="html-bibr">123</a>,<a href="#B124-forests-14-00945" class="html-bibr">124</a>]. Thus, a pre-selection of the features with the greatest potential to differentiate the species can optimize and improve the classification accuracy. However, as trees are complex structures, different features from different data sources (i.e., different remote sensors) can be complementary and improve the separability and classification of tree species. This can be seen in the results obtained in this study, as well as in the other studies cited above, in which the use of many features did not decrease the accuracy; rather, the accuracy was similar to or even better than that obtained with a pre-selection or with fewer features as input data for classifying tree species.</div><div class='html-p'>The transformation of the PR LiDAR metrics by PCA and the use of these features alone or with the spectra or VIs were not very effective for the classification. However, using all the features extracted from the hyperspectral images and from the PR and FWF LiDAR point clouds, transformed by PCA, resulted in the best classification accuracy (OA of 76%).
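</div><div class='html-p'>As a concrete illustration of this PCA-based feature transformation (a sketch under assumed names, not the code used in the study), the combined feature matrix can be centred, scaled, and projected onto its principal components before training the RF classifier; the loadings indicate how much each original feature contributes to each PC, as visualized in <a href="#forests-14-00945-f014" class="html-fig">Figure 14</a>.</div><div class='html-p'><pre><code class="language-r">
## Illustrative sketch: PCA transformation of the combined feature set (raw
## spectra, VIs, PR and FWF LiDAR metrics) before RF classification.
## `X` (numeric feature matrix) and `species` (factor of labels) are assumed.
pca    = prcomp(X, center = TRUE, scale. = TRUE)
expl   = cumsum(pca$sdev^2) / sum(pca$sdev^2)   # cumulative explained variance
k      = which(expl >= 0.99)[1]                 # retained PCs (illustrative threshold)
scores = as.data.frame(pca$x[, 1:k, drop = FALSE])
scores$species = species

fit = randomForest::randomForest(species ~ ., data = scores,
                                 ntree = 1000, mtry = floor(sqrt(k)))

## Loadings: contribution of each original feature to the first PCs (cf. Figure 14).
round(pca$rotation[, 1:4], 2)
</code></pre></div><div class='html-p'>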
According to [<a href="#B22-forests-14-00945" class="html-bibr">22</a>], the high dimensionality and redundancy of hyperspectral data make it difficult to extract information using moving windows on raster data; thus, the transformation by PCA facilitates this extraction. In addition to decreasing the correlation between features that may be redundant, the PCA transformation can improve the classification accuracy when using small training set sizes [<a href="#B125-forests-14-00945" class="html-bibr">125</a>]. </div></section></section><section id='sec5-forests-14-00945' type='conclusions'><h2 data-nested='1'> 5. Conclusions</h2><div class='html-p'>In this study, we tested the automatic classification of eight tree species present in the upper canopy of a remnant of the Brazilian Atlantic Forest. Thirteen different classification scenarios were tested using remote sensing data from different sources (i.e., hyperspectral images collected from a UAV and airborne PR (peak return) and FWF (full-waveform) LiDAR data) as input for the classification.</div><div class='html-p'>The segmentation of the tree crowns was performed using the superpixel algorithm. Due to the low number of samples (81 trees), a manual correction of the segments that were not correctly delineated was made. Although effective, the method is time-consuming, especially when working with a large number of samples, and more studies on this topic are recommended for tropical forests.</div><div class='html-p'>Among the tested scenarios, the isolated use of LiDAR metrics for classification, regardless of the type of return and of the transformation by PCA (principal component analysis), was not effective, resulting in the lowest overall accuracies (between 33% and 36%). The use of the raw spectra of the hyperspectral images and of the VIs (vegetation indices) achieved better accuracy than the use of LiDAR data alone (55% for both feature sets). However, the results with this data configuration were still not satisfactory. </div><div class='html-p'>When the spectral features were combined with the LiDAR metrics, there was a considerable increase in classification accuracy, to between 64% and 76%, except when combining the raw spectra or VIs with the PR LiDAR metrics transformed by PCA (accuracies of 55% and 58%, respectively). The use of all features (raw spectra, VIs, and PR and FWF LiDAR metrics), transformed by PCA, was the best classification scenario (overall accuracy of 76%), followed by the use of the VIs and FWF LiDAR metrics (overall accuracy of 73%). However, considering the confidence intervals at 95% probability, there was no significant difference among the scenarios combining PR or FWF LiDAR metrics with the raw spectra or VIs.</div><div class='html-p'>Analysing the overall accuracies obtained in the different classification scenarios and the most important features for the best scenarios, provided by the RF (random forest) algorithm, it can be concluded that cameras collecting data in the visible, red-edge, and NIR wavelengths are sufficient to calculate most of the VIs and provide sufficient spectral information. Combined with PR LiDAR metrics (e.g., height percentiles and the number of returns for each emitted pulse), they can achieve satisfactory accuracies in the classification of tree species in complex forests.
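</div><div class='html-p'>For reference, the sketch below shows how PR LiDAR metrics of this kind (height percentiles and return proportions) can be derived per crown with the <span class='html-italic'>lidR</span> package from a height-normalized point cloud clipped to a single crown; the file name and the metric list are illustrative assumptions, not the exact set used in this study.</div><div class='html-p'><pre><code class="language-r">
## Illustrative sketch: per-crown PR LiDAR metrics with the lidR package.
## "crown_001.laz" is a placeholder for a height-normalized, crown-clipped cloud.
library(lidR)

las = readLAS("crown_001.laz")
m = cloud_metrics(las, ~list(
  zq25    = quantile(Z, 0.25),          # height percentiles
  zq50    = quantile(Z, 0.50),
  zq95    = quantile(Z, 0.95),
  zmean   = mean(Z),
  zsd     = sd(Z),
  p_first = mean(ReturnNumber == 1)     # proportion of first returns
))
</code></pre>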
</div><div class='html-p'>Data acquisition with UAVs can reduce costs and improve usability, but it requires the development of suitable sensors, such as lightweight multispectral cameras and LiDAR sensors able to record multiple returns and intensity with a greater density of points. Because UAVs can operate at a lower flight height, they allow greater flexibility for data collection in different areas and can generate outputs with good accuracy.</div><div class='html-p'>Despite the difficulties observed in this study, mainly in relation to the low sampling sufficiency of the trees, the time lag between the different flight campaigns, and the high heterogeneity of the forest canopy, the classification results were satisfactory for the complex forest environment studied. These results can support management and conservation practices in these forest remnants, allowing a better understanding of the spatial distribution of species with potential for forest restoration.</div></section> </div> <div class="html-back"> <section><section id='app1-forests-14-00945' type=''><h2 data-nested='1'> Supplementary Materials</h2><div class='html-p'>The following supporting information can be downloaded at: <a href='https://www.mdpi.com/article/10.3390/f14050945/s1' target='_blank' rel="noopener noreferrer">https://www.mdpi.com/article/10.3390/f14050945/s1</a>, Figure S1: Spectra obtained from each tree of each species, from the hyperspectral orthomosaics. Figure S2: Distribution of the species classified in the best scenario (S13). The segment colour represents the classification result, and the border colour refers to the correct field identification.</div></section></section><section class='html-notes'><h2 >Author Contributions</h2><div class='html-p'>Conceptualization, R.P.M.-N., A.M.G.T., N.N.I., E.H., E.A.S.M., M.M. and H.C.D.; methodology, R.P.M.-N., A.M.G.T., N.N.I., E.H. and E.A.S.M.; software, R.P.M.-N., E.H., E.A.S.M. and M.M.; validation, R.P.M.-N. and H.C.D.; formal analysis, R.P.M.-N.; investigation, R.P.M.-N., A.M.G.T., N.N.I., E.H. and M.M.; resources, A.M.G.T., N.N.I. and E.H.; data curation, R.P.M.-N.; writing—original draft preparation, R.P.M.-N.; writing—review and editing, R.P.M.-N., A.M.G.T., N.N.I., E.H., E.A.S.M., M.M. and H.C.D.; visualization, R.P.M.-N., A.M.G.T., N.N.I. and H.C.D.; supervision, A.M.G.T., N.N.I., M.M. and H.C.D.; project administration, A.M.G.T., N.N.I. and E.H.; funding acquisition, A.M.G.T., N.N.I. and E.H. All authors have read and agreed to the published version of the manuscript.</div></section><section class='html-notes'><h2>Funding</h2><div class='html-p'>This study was funded by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior–Brazil (CAPES)–Finance Code 001 (process number 88882.433953/2019-01); by the Programa Institucional de Internacionalização (CAPES/PrInt)–process number 88881.310314/2018-01; by the Conselho Nacional de Desenvolvimento Científico e Tecnológico–Brazil (CNPq)–process numbers 404379/2016-8 and 303670/2018-5; and by the Brazilian–Finnish joint project “Unmanned Airborne Vehicle–Based 4D Remote Sensing for Mapping Rain Forest Biodiversity and its Change in Brazil”, financed in part by the São Paulo Research Foundation (FAPESP), grant number 2013/50426-4, and in part by the Academy of Finland (AKA), grant number 273806.</div></section><section class='html-notes'><h2 >Data Availability Statement</h2><div class='html-p'>The data presented in this study are not available on request.
The data are not publicly available because the study area is protected by federal laws.</div></section><section id='html-ack' class='html-ack'><h2 >Acknowledgments</h2><div class='html-p'>The authors would like to thank Baltazar Casagrande and Valter Ribeiro Campos for their assistance with the field surveys and species recognition and the company ENGEMAP for providing the ALS point cloud from the study area. M.M. was funded by a UKRI Future Leaders Fellowship (MR/T019832/1). For the purpose of open access, the author has applied a Creative Commons Attribution (CC BY) licence to any Author Accepted Manuscript version arising.</div></section><section class='html-notes'><h2 >Conflicts of Interest</h2><div class='html-p'>The authors declare no conflict of interest.</div></section><section id='html-references_list'><h2>References</h2><ol class='html-xxx'><li id='B1-forests-14-00945' class='html-x' data-content='1.'>Zhou, X.; Fu, Y.; Zhou, L.; Li, B.; Luo, Y. An Imperative Need for Global Change Research in Tropical Forests. <span class='html-italic'>Tree Physiol.</span> <b>2013</b>, <span class='html-italic'>33</span>, 903–912. [<a href="https://scholar.google.com/scholar_lookup?title=An+Imperative+Need+for+Global+Change+Research+in+Tropical+Forests&author=Zhou,+X.&author=Fu,+Y.&author=Zhou,+L.&author=Li,+B.&author=Luo,+Y.&publication_year=2013&journal=Tree+Physiol.&volume=33&pages=903%E2%80%93912&doi=10.1093/treephys/tpt064&pmid=24128847" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1093/treephys/tpt064" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>] [<a href="http://www.ncbi.nlm.nih.gov/pubmed/24128847" class='cross-ref' data-typ='pmid' target='_blank' rel='noopener noreferrer'>PubMed</a>]</li><li id='B2-forests-14-00945' class='html-x' data-content='2.'>Hassan, R.; Scholes, R.; Ash, N. <span class='html-italic'>Ecosystems and Human Well-Being: Current State and Trends</span>; Island Press: Washington, DC, USA, 2005. [<a href="https://scholar.google.com/scholar_lookup?title=Ecosystems+and+Human+Well-Being:+Current+State+and+Trends&author=Hassan,+R.&author=Scholes,+R.&author=Ash,+N.&publication_year=2005" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B3-forests-14-00945' class='html-x' data-content='3.'>Lopez-Gonzalez, G.; Lewis, S.L.; Burkitt, M.; Phillips, O.L. ForestPlots. Net: A Web Application and Research Tool to Manage and Analyse Tropical Forest Plot Data. <span class='html-italic'>J. Veg. Sci.</span> <b>2011</b>, <span class='html-italic'>22</span>, 610–613. [<a href="https://scholar.google.com/scholar_lookup?title=ForestPlots.+Net:+A+Web+Application+and+Research+Tool+to+Manage+and+Analyse+Tropical+Forest+Plot+Data&author=Lopez-Gonzalez,+G.&author=Lewis,+S.L.&author=Burkitt,+M.&author=Phillips,+O.L.&publication_year=2011&journal=J.+Veg.+Sci.&volume=22&pages=610%E2%80%93613&doi=10.1111/j.1654-1103.2011.01312.x" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1111/j.1654-1103.2011.01312.x" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B4-forests-14-00945' class='html-x' data-content='4.'>Fassnacht, F.E.; Latifi, H.; Stereńczak, K.; Modzelewska, A.; Lefsky, M.; Waser, L.T.; Straub, C.; Ghosh, A. Review of Studies on Tree Species Classification from Remotely Sensed Data. <span class='html-italic'>Remote Sens.
Environ.</span> <b>2016</b>, <span class='html-italic'>186</span>, 64–87. [<a href="https://scholar.google.com/scholar_lookup?title=Review+of+Studies+on+Tree+Species+Classification+from+Remotely+Sensed+Data&author=Fassnacht,+F.E.&author=Latifi,+H.&author=Stere%C5%84czak,+K.&author=Modzelewska,+A.&author=Lefsky,+M.&author=Waser,+L.T.&author=Straub,+C.&author=Ghosh,+A.&publication_year=2016&journal=Remote+Sens.+Environ.&volume=186&pages=64%E2%80%9387&doi=10.1016/j.rse.2016.08.013" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2016.08.013" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B5-forests-14-00945' class='html-x' data-content='5.'>Piiroinen, R.; Heiskanen, J.; Maeda, E.; Viinikka, A.; Pellikka, P. Classification of Tree Species in a Diverse African Agroforestry Landscape Using Imaging Spectroscopy and Laser Scanning. <span class='html-italic'>Remote Sens.</span> <b>2017</b>, <span class='html-italic'>9</span>, 875. [<a href="https://scholar.google.com/scholar_lookup?title=Classification+of+Tree+Species+in+a+Diverse+African+Agroforestry+Landscape+Using+Imaging+Spectroscopy+and+Laser+Scanning&author=Piiroinen,+R.&author=Heiskanen,+J.&author=Maeda,+E.&author=Viinikka,+A.&author=Pellikka,+P.&publication_year=2017&journal=Remote+Sens.&volume=9&pages=875&doi=10.3390/rs9090875" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs9090875" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B6-forests-14-00945' class='html-x' data-content='6.'>Michalowska, M.; Rapiński, J. A Review of Tree Species Classification Based on Airborne LiDAR Data and Applied Classifiers. <span class='html-italic'>Remote Sens.</span> <b>2021</b>, <span class='html-italic'>13</span>, 353. [<a href="https://scholar.google.com/scholar_lookup?title=A+Review+of+Tree+Species+Classification+Based+on+Airborne+LiDAR+Data+and+Applied+Classifiers&author=Michalowska,+M.&author=Rapi%C5%84ski,+J.&publication_year=2021&journal=Remote+Sens.&volume=13&pages=353&doi=10.3390/rs13030353" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs13030353" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B7-forests-14-00945' class='html-x' data-content='7.'>Cochrane, M.A. Using Vegetation Reflectance Variability for Species Level Classification of Hyperspectral Data. <span class='html-italic'>Int. J. Remote Sens.</span> <b>2000</b>, <span class='html-italic'>21</span>, 2075–2087. [<a href="https://scholar.google.com/scholar_lookup?title=Using+Vegetation+Reflectance+Variability+for+Species+Level+Classification+of+Hyperspectral+Data&author=Cochrane,+M.A.&publication_year=2000&journal=Int.+J.+Remote+Sens.&volume=21&pages=2075%E2%80%932087&doi=10.1080/01431160050021303" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1080/01431160050021303" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B8-forests-14-00945' class='html-x' data-content='8.'>Féret, J.-B.; Asner, G.P. Tree Species Discrimination in Tropical Forests Using Airborne Imaging Spectroscopy. <span class='html-italic'>IEEE Trans. Geosci. Remote Sens.</span> <b>2012</b>, <span class='html-italic'>51</span>, 73–84. 
[<a href="https://scholar.google.com/scholar_lookup?title=Tree+Species+Discrimination+in+Tropical+Forests+Using+Airborne+Imaging+Spectroscopy&author=F%C3%A9ret,+J.-B.&author=Asner,+G.P.&publication_year=2012&journal=IEEE+Trans.+Geosci.+Remote+Sens.&volume=51&pages=73%E2%80%9384&doi=10.1109/TGRS.2012.2199323" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1109/TGRS.2012.2199323" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B9-forests-14-00945' class='html-x' data-content='9.'>Zhang, J.; Rivard, B.; Sánchez-Azofeifa, A.; Castro-Esau, K. Intra-and Inter-Class Spectral Variability of Tropical Tree Species at La Selva, Costa Rica: Implications for Species Identification Using HYDICE Imagery. <span class='html-italic'>Remote Sens. Environ.</span> <b>2006</b>, <span class='html-italic'>105</span>, 129–141. [<a href="https://scholar.google.com/scholar_lookup?title=Intra-and+Inter-Class+Spectral+Variability+of+Tropical+Tree+Species+at+La+Selva,+Costa+Rica:+Implications+for+Species+Identification+Using+HYDICE+Imagery&author=Zhang,+J.&author=Rivard,+B.&author=S%C3%A1nchez-Azofeifa,+A.&author=Castro-Esau,+K.&publication_year=2006&journal=Remote+Sens.+Environ.&volume=105&pages=129%E2%80%93141&doi=10.1016/j.rse.2006.06.010" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2006.06.010" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B10-forests-14-00945' class='html-xx' data-content='10.'>Buddenbaum, H.; Seeling, S.; Hill, J. Fusion of Full-Waveform Lidar and Imaging Spectroscopy Remote Sensing Data for the Characterization of Forest Stands. <span class='html-italic'>Int. J. Remote Sens.</span> <b>2013</b>, <span class='html-italic'>34</span>, 4511–4524. [<a href="https://scholar.google.com/scholar_lookup?title=Fusion+of+Full-Waveform+Lidar+and+Imaging+Spectroscopy+Remote+Sensing+Data+for+the+Characterization+of+Forest+Stands&author=Buddenbaum,+H.&author=Seeling,+S.&author=Hill,+J.&publication_year=2013&journal=Int.+J.+Remote+Sens.&volume=34&pages=4511%E2%80%934524&doi=10.1080/01431161.2013.776721" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1080/01431161.2013.776721" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B11-forests-14-00945' class='html-xx' data-content='11.'>Kim, S.; McGaughey, R.J.; Andersen, H.-E.; Schreuder, G. Tree Species Differentiation Using Intensity Data Derived from Leaf-on and Leaf-off Airborne Laser Scanner Data. <span class='html-italic'>Remote Sens. Environ.</span> <b>2009</b>, <span class='html-italic'>113</span>, 1575–1586. [<a href="https://scholar.google.com/scholar_lookup?title=Tree+Species+Differentiation+Using+Intensity+Data+Derived+from+Leaf-on+and+Leaf-off+Airborne+Laser+Scanner+Data&author=Kim,+S.&author=McGaughey,+R.J.&author=Andersen,+H.-E.&author=Schreuder,+G.&publication_year=2009&journal=Remote+Sens.+Environ.&volume=113&pages=1575%E2%80%931586&doi=10.1016/j.rse.2009.03.017" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2009.03.017" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B12-forests-14-00945' class='html-xx' data-content='12.'>Morsdorf, F.; Mårell, A.; Koetz, B.; Cassagne, N.; Pimont, F.; Rigolot, E.; Allgöwer, B. 
Discrimination of Vegetation Strata in a Multi-Layered Mediterranean Forest Ecosystem Using Height and Intensity Information Derived from Airborne Laser Scanning. <span class='html-italic'>Remote Sens. Environ.</span> <b>2010</b>, <span class='html-italic'>114</span>, 1403–1415. [<a href="https://scholar.google.com/scholar_lookup?title=Discrimination+of+Vegetation+Strata+in+a+Multi-Layered+Mediterranean+Forest+Ecosystem+Using+Height+and+Intensity+Information+Derived+from+Airborne+Laser+Scanning&author=Morsdorf,+F.&author=M%C3%A5rell,+A.&author=Koetz,+B.&author=Cassagne,+N.&author=Pimont,+F.&author=Rigolot,+E.&author=Allg%C3%B6wer,+B.&publication_year=2010&journal=Remote+Sens.+Environ.&volume=114&pages=1403%E2%80%931415&doi=10.1016/j.rse.2010.01.023" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2010.01.023" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B13-forests-14-00945' class='html-xx' data-content='13.'>Shan, J.; Toth, C.K. <span class='html-italic'>Topographic Laser Ranging and Scanning: Principles and Processing</span>; CRC Press: New York, NY, USA, 2018. [<a href="https://scholar.google.com/scholar_lookup?title=Topographic+Laser+Ranging+and+Scanning:+Principles+and+Processing&author=Shan,+J.&author=Toth,+C.K.&publication_year=2018" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B14-forests-14-00945' class='html-xx' data-content='14.'>Favorskaya, M.N.; Jain, L.C. Overview of LiDAR Technologies and Equipment for Land Cover Scanning. In <span class='html-italic'>Handbook on Advances in Remote Sensing and Geographic Information Systems</span>; Springer: Berlin/Heidelberg, Germany, 2017; Volume 122, pp. 19–68. [<a href="https://scholar.google.com/scholar_lookup?title=Overview+of+LiDAR+Technologies+and+Equipment+for+Land+Cover+Scanning&author=Favorskaya,+M.N.&author=Jain,+L.C.&publication_year=2017&pages=19%E2%80%9368" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B15-forests-14-00945' class='html-xx' data-content='15.'>Thiel, K.H.; Wehr, A. Performance Capabilities of Laser Scanners–an Overview and Measurement Principle Analysis. <span class='html-italic'>Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.</span> <b>2004</b>, <span class='html-italic'>36</span>, 14–18. [<a href="https://scholar.google.com/scholar_lookup?title=Performance+Capabilities+of+Laser+Scanners%E2%80%93an+Overview+and+Measurement+Principle+Analysis&author=Thiel,+K.H.&author=Wehr,+A.&publication_year=2004&journal=Int.+Arch.+Photogramm.+Remote+Sens.+Spat.+Inf.+Sci.&volume=36&pages=14%E2%80%9318" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B16-forests-14-00945' class='html-xx' data-content='16.'>Lim, K.; Treitz, P.; Wulder, M.; St-Onge, B.; Flood, M. LiDAR Remote Sensing of Forest Structure. <span class='html-italic'>Prog. Phys. Geogr.</span> <b>2003</b>, <span class='html-italic'>27</span>, 88–106. 
[<a href="https://scholar.google.com/scholar_lookup?title=LiDAR+Remote+Sensing+of+Forest+Structure&author=Lim,+K.&author=Treitz,+P.&author=Wulder,+M.&author=St-Onge,+B.&author=Flood,+M.&publication_year=2003&journal=Prog.+Phys.+Geogr.&volume=27&pages=88%E2%80%93106&doi=10.1191/0309133303pp360ra" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1191/0309133303pp360ra" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B17-forests-14-00945' class='html-xx' data-content='17.'>Mallet, C.; Bretar, F. Full-Waveform Topographic Lidar: State-of-the-Art. <span class='html-italic'>ISPRS J. Photogramm. Remote Sens.</span> <b>2009</b>, <span class='html-italic'>64</span>, 1–16. [<a href="https://scholar.google.com/scholar_lookup?title=Full-Waveform+Topographic+Lidar:+State-of-the-Art&author=Mallet,+C.&author=Bretar,+F.&publication_year=2009&journal=ISPRS+J.+Photogramm.+Remote+Sens.&volume=64&pages=1%E2%80%9316&doi=10.1016/j.isprsjprs.2008.09.007" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.isprsjprs.2008.09.007" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B18-forests-14-00945' class='html-xx' data-content='18.'>Pirotti, F. Analysis of Full-Waveform LiDAR Data for Forestry Applications: A Review of Investigations and Methods. <span class='html-italic'>Iforest-Biogeosciences For.</span> <b>2011</b>, <span class='html-italic'>4</span>, 100. [<a href="https://scholar.google.com/scholar_lookup?title=Analysis+of+Full-Waveform+LiDAR+Data+for+Forestry+Applications:+A+Review+of+Investigations+and+Methods&author=Pirotti,+F.&publication_year=2011&journal=Iforest-Biogeosciences+For.&volume=4&pages=100&doi=10.3832/ifor0562-004" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3832/ifor0562-004" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B19-forests-14-00945' class='html-xx' data-content='19.'>Reitberger, J.; Krzystek, P.; Stilla, U. Analysis of Full Waveform LIDAR Data for the Classification of Deciduous and Coniferous Trees. <span class='html-italic'>Int. J. Remote Sens.</span> <b>2008</b>, <span class='html-italic'>29</span>, 1407–1431. [<a href="https://scholar.google.com/scholar_lookup?title=Analysis+of+Full+Waveform+LIDAR+Data+for+the+Classification+of+Deciduous+and+Coniferous+Trees&author=Reitberger,+J.&author=Krzystek,+P.&author=Stilla,+U.&publication_year=2008&journal=Int.+J.+Remote+Sens.&volume=29&pages=1407%E2%80%931431&doi=10.1080/01431160701736448" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1080/01431160701736448" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B20-forests-14-00945' class='html-xx' data-content='20.'>RIEGL DataSheet LMS-Q680i. Available online: <a href='http://www.riegl.com/uploads/tx_pxpriegldownloads/10_DataSheet_LMS-Q680i_28-09-2012_01.pdf' target='_blank' rel="noopener noreferrer" >http://www.riegl.com/uploads/tx_pxpriegldownloads/10_DataSheet_LMS-Q680i_28-09-2012_01.pdf</a> (accessed on 7 April 2021).</li><li id='B21-forests-14-00945' class='html-xx' data-content='21.'>Luo, S.; Wang, C.; Xi, X.; Zeng, H.; Li, D.; Xia, S.; Wang, P. Fusion of Airborne Discrete-Return LiDAR and Hyperspectral Data for Land Cover Classification. 
<span class='html-italic'>Remote Sens.</span> <b>2016</b>, <span class='html-italic'>8</span>, 3. [<a href="https://scholar.google.com/scholar_lookup?title=Fusion+of+Airborne+Discrete-Return+LiDAR+and+Hyperspectral+Data+for+Land+Cover+Classification&author=Luo,+S.&author=Wang,+C.&author=Xi,+X.&author=Zeng,+H.&author=Li,+D.&author=Xia,+S.&author=Wang,+P.&publication_year=2016&journal=Remote+Sens.&volume=8&pages=3&doi=10.3390/rs8010003" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs8010003" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B22-forests-14-00945' class='html-xx' data-content='22.'>Liao, W.; Van Coillie, F.; Gao, L.; Li, L.; Zhang, B.; Chanussot, J. Deep Learning for Fusion of APEX Hyperspectral and Full-Waveform LiDAR Remote Sensing Data for Tree Species Mapping. <span class='html-italic'>IEEE Access</span> <b>2018</b>, <span class='html-italic'>6</span>, 68716–68729. [<a href="https://scholar.google.com/scholar_lookup?title=Deep+Learning+for+Fusion+of+APEX+Hyperspectral+and+Full-Waveform+LiDAR+Remote+Sensing+Data+for+Tree+Species+Mapping&author=Liao,+W.&author=Van+Coillie,+F.&author=Gao,+L.&author=Li,+L.&author=Zhang,+B.&author=Chanussot,+J.&publication_year=2018&journal=IEEE+Access&volume=6&pages=68716%E2%80%9368729&doi=10.1109/ACCESS.2018.2880083" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1109/ACCESS.2018.2880083" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B23-forests-14-00945' class='html-xx' data-content='23.'>Guerra, T.N.F.; Rodal, M.J.N.; e Silva, A.C.B.L.; Alves, M.; Silva, M.A.M.; de Araújo Mendes, P.G. Influence of Edge and Topography on the Vegetation in an Atlantic Forest Remnant in Northeastern Brazil. <span class='html-italic'>J. For. Res.</span> <b>2013</b>, <span class='html-italic'>18</span>, 200–208. [<a href="https://scholar.google.com/scholar_lookup?title=Influence+of+Edge+and+Topography+on+the+Vegetation+in+an+Atlantic+Forest+Remnant+in+Northeastern+Brazil&author=Guerra,+T.N.F.&author=Rodal,+M.J.N.&author=e+Silva,+A.C.B.L.&author=Alves,+M.&author=Silva,+M.A.M.&author=de+Ara%C3%BAjo+Mendes,+P.G.&publication_year=2013&journal=J.+For.+Res.&volume=18&pages=200%E2%80%93208&doi=10.1007/s10310-012-0344-3" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1007/s10310-012-0344-3" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B24-forests-14-00945' class='html-xx' data-content='24.'>Scarano, F.R.; Ceotto, P. Brazilian Atlantic Forest: Impact, Vulnerability, and Adaptation to Climate Change. <span class='html-italic'>Biodivers. Conserv.</span> <b>2015</b>, <span class='html-italic'>24</span>, 2319–2331. 
[<a href="https://scholar.google.com/scholar_lookup?title=Brazilian+Atlantic+Forest:+Impact,+Vulnerability,+and+Adaptation+to+Climate+Change&author=Scarano,+F.R.&author=Ceotto,+P.&publication_year=2015&journal=Biodivers.+Conserv.&volume=24&pages=2319%E2%80%932331&doi=10.1007/s10531-015-0972-y" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1007/s10531-015-0972-y" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B25-forests-14-00945' class='html-xx' data-content='25.'>Haddad, N.M.; Brudvig, L.A.; Clobert, J.; Davies, K.F.; Gonzalez, A.; Holt, R.D.; Lovejoy, T.E.; Sexton, J.O.; Austin, M.P.; Collins, C.D. Habitat Fragmentation and Its Lasting Impact on Earth’s Ecosystems. <span class='html-italic'>Sci. Adv.</span> <b>2015</b>, <span class='html-italic'>1</span>, e1500052. [<a href="https://scholar.google.com/scholar_lookup?title=Habitat+Fragmentation+and+Its+Lasting+Impact+on+Earth%E2%80%99s+Ecosystems&author=Haddad,+N.M.&author=Brudvig,+L.A.&author=Clobert,+J.&author=Davies,+K.F.&author=Gonzalez,+A.&author=Holt,+R.D.&author=Lovejoy,+T.E.&author=Sexton,+J.O.&author=Austin,+M.P.&author=Collins,+C.D.&publication_year=2015&journal=Sci.+Adv.&volume=1&pages=e1500052&doi=10.1126/sciadv.1500052" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1126/sciadv.1500052" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B26-forests-14-00945' class='html-xx' data-content='26.'>Rodrigues, R.R.; Lima, R.A.; Gandolfi, S.; Nave, A.G. On the Restoration of High Diversity Forests: 30 Years of Experience in the Brazilian Atlantic Forest. <span class='html-italic'>Biol. Conserv.</span> <b>2009</b>, <span class='html-italic'>142</span>, 1242–1251. [<a href="https://scholar.google.com/scholar_lookup?title=On+the+Restoration+of+High+Diversity+Forests:+30+Years+of+Experience+in+the+Brazilian+Atlantic+Forest&author=Rodrigues,+R.R.&author=Lima,+R.A.&author=Gandolfi,+S.&author=Nave,+A.G.&publication_year=2009&journal=Biol.+Conserv.&volume=142&pages=1242%E2%80%931251&doi=10.1016/j.biocon.2008.12.008" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.biocon.2008.12.008" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B27-forests-14-00945' class='html-xx' data-content='27.'>Werneck, M.d.S.; Sobral, M.E.G.; Rocha, C.T.V.; Landau, E.C.; Stehmann, J.R. Distribution and Endemism of Angiosperms in the Atlantic Forest. <span class='html-italic'>Nat. Conserv.</span> <b>2011</b>, <span class='html-italic'>9</span>, 188–193. [<a href="https://scholar.google.com/scholar_lookup?title=Distribution+and+Endemism+of+Angiosperms+in+the+Atlantic+Forest&author=Werneck,+M.d.S.&author=Sobral,+M.E.G.&author=Rocha,+C.T.V.&author=Landau,+E.C.&author=Stehmann,+J.R.&publication_year=2011&journal=Nat.+Conserv.&volume=9&pages=188%E2%80%93193&doi=10.4322/natcon.2011.024" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.4322/natcon.2011.024" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B28-forests-14-00945' class='html-xx' data-content='28.'>Myers, N.; Mittermeier, R.A.; Mittermeier, C.G.; Da Fonseca, G.A.; Kent, J. Biodiversity Hotspots for Conservation Priorities. 
<span class='html-italic'>Nature</span> <b>2000</b>, <span class='html-italic'>403</span>, 853. [<a href="https://scholar.google.com/scholar_lookup?title=Biodiversity+Hotspots+for+Conservation+Priorities&author=Myers,+N.&author=Mittermeier,+R.A.&author=Mittermeier,+C.G.&author=Da+Fonseca,+G.A.&author=Kent,+J.&publication_year=2000&journal=Nature&volume=403&pages=853&doi=10.1038/35002501&pmid=10706275" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1038/35002501" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>] [<a href="http://www.ncbi.nlm.nih.gov/pubmed/10706275" class='cross-ref' data-typ='pmid' target='_blank' rel='noopener noreferrer'>PubMed</a>]</li><li id='B29-forests-14-00945' class='html-xx' data-content='29.'>Ferreira, M.P.; Zortea, M.; Zanotta, D.C.; Shimabukuro, Y.E.; de Souza Filho, C.R. Mapping Tree Species in Tropical Seasonal Semi-Deciduous Forests with Hyperspectral and Multispectral Data. <span class='html-italic'>Remote Sens. Environ.</span> <b>2016</b>, <span class='html-italic'>179</span>, 66–78. [<a href="https://scholar.google.com/scholar_lookup?title=Mapping+Tree+Species+in+Tropical+Seasonal+Semi-Deciduous+Forests+with+Hyperspectral+and+Multispectral+Data&author=Ferreira,+M.P.&author=Zortea,+M.&author=Zanotta,+D.C.&author=Shimabukuro,+Y.E.&author=de+Souza+Filho,+C.R.&publication_year=2016&journal=Remote+Sens.+Environ.&volume=179&pages=66%E2%80%9378&doi=10.1016/j.rse.2016.03.021" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2016.03.021" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B30-forests-14-00945' class='html-xx' data-content='30.'>Berveglieri, A.; Imai, N.N.; Tommaselli, A.M.; Martins-Neto, R.P.; Miyoshi, G.T.; Honkavaara, E. Forest Cover Change Analysis Based on Temporal Gradients of the Vertical Structure and Density. <span class='html-italic'>Ecol. Indic.</span> <b>2021</b>, <span class='html-italic'>126</span>, 107597. [<a href="https://scholar.google.com/scholar_lookup?title=Forest+Cover+Change+Analysis+Based+on+Temporal+Gradients+of+the+Vertical+Structure+and+Density&author=Berveglieri,+A.&author=Imai,+N.N.&author=Tommaselli,+A.M.&author=Martins-Neto,+R.P.&author=Miyoshi,+G.T.&author=Honkavaara,+E.&publication_year=2021&journal=Ecol.+Indic.&volume=126&pages=107597&doi=10.1016/j.ecolind.2021.107597" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.ecolind.2021.107597" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B31-forests-14-00945' class='html-xx' data-content='31.'>Berveglieri, A.; Imai, N.N.; Tommaselli, A.M.; Casagrande, B.; Honkavaara, E. Successional Stages and Their Evolution in Tropical Forests Using Multi-Temporal Photogrammetric Surface Models and Superpixels. <span class='html-italic'>ISPRS J. Photogramm. Remote Sens.</span> <b>2018</b>, <span class='html-italic'>146</span>, 548–558. 
[<a href="https://scholar.google.com/scholar_lookup?title=Successional+Stages+and+Their+Evolution+in+Tropical+Forests+Using+Multi-Temporal+Photogrammetric+Surface+Models+and+Superpixels&author=Berveglieri,+A.&author=Imai,+N.N.&author=Tommaselli,+A.M.&author=Casagrande,+B.&author=Honkavaara,+E.&publication_year=2018&journal=ISPRS+J.+Photogramm.+Remote+Sens.&volume=146&pages=548%E2%80%93558&doi=10.1016/j.isprsjprs.2018.11.002" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.isprsjprs.2018.11.002" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B32-forests-14-00945' class='html-xx' data-content='32.'>Berveglieri, A.; Tommaselli, A.M.G.; Imai, N.N.; Ribeiro, E.A.W.; Guimaraes, R.B.; Honkavaara, E. Identification of Successional Stages and Cover Changes of Tropical Forest Based on Digital Surface Model Analysis. <span class='html-italic'>IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.</span> <b>2016</b>, <span class='html-italic'>9</span>, 5385–5397. [<a href="https://scholar.google.com/scholar_lookup?title=Identification+of+Successional+Stages+and+Cover+Changes+of+Tropical+Forest+Based+on+Digital+Surface+Model+Analysis&author=Berveglieri,+A.&author=Tommaselli,+A.M.G.&author=Imai,+N.N.&author=Ribeiro,+E.A.W.&author=Guimaraes,+R.B.&author=Honkavaara,+E.&publication_year=2016&journal=IEEE+J.+Sel.+Top.+Appl.+Earth+Obs.+Remote+Sens.&volume=9&pages=5385%E2%80%935397&doi=10.1109/JSTARS.2016.2606320" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1109/JSTARS.2016.2606320" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B33-forests-14-00945' class='html-xx' data-content='33.'>Martins-Neto, R.; Tommaselli, A.; Imai, N.; Berveglieri, A.; Thomaz, M.; Miyoshi, G.; Casagrande, B.; Guimarães, R.; Ribeiro, E.; Honkavaara, E. Structure and Tree Diversity of an Inland Atlantic Forest—A Case Study of Ponte Branca Forest Remnant, Brazil. <span class='html-italic'>Indones. J. Geogr.</span> <b>2022</b>, <span class='html-italic'>54</span>, 9. [<a href="https://scholar.google.com/scholar_lookup?title=Structure+and+Tree+Diversity+of+an+Inland+Atlantic+Forest%E2%80%94A+Case+Study+of+Ponte+Branca+Forest+Remnant,+Brazil&author=Martins-Neto,+R.&author=Tommaselli,+A.&author=Imai,+N.&author=Berveglieri,+A.&author=Thomaz,+M.&author=Miyoshi,+G.&author=Casagrande,+B.&author=Guimar%C3%A3es,+R.&author=Ribeiro,+E.&author=Honkavaara,+E.&publication_year=2022&journal=Indones.+J.+Geogr.&volume=54&pages=9&doi=10.22146/ijg.61120" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.22146/ijg.61120" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B34-forests-14-00945' class='html-xx' data-content='34.'>Chase, M.W.; Christenhusz, M.J.M.; Fay, M.F.; Byng, J.W.; Judd, W.S.; Soltis, D.E.; Mabberley, D.J.; Sennikov, A.N.; Soltis, P.S.; Stevens, P.F. An Update of the Angiosperm Phylogeny Group Classification for the Orders and Families of Flowering Plants: APG IV. <span class='html-italic'>Bot. J. Linn. Soc.</span> <b>2016</b>, <span class='html-italic'>181</span>, 1–20. 
[<a href="https://scholar.google.com/scholar_lookup?title=An+Update+of+the+Angiosperm+Phylogeny+Group+Classification+for+the+Orders+and+Families+of+Flowering+Plants:+APG+IV&author=Chase,+M.W.&author=Christenhusz,+M.J.M.&author=Fay,+M.F.&author=Byng,+J.W.&author=Judd,+W.S.&author=Soltis,+D.E.&author=Mabberley,+D.J.&author=Sennikov,+A.N.&author=Soltis,+P.S.&author=Stevens,+P.F.&publication_year=2016&journal=Bot.+J.+Linn.+Soc.&volume=181&pages=1%E2%80%9320&doi=10.1111/boj.12385" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1111/boj.12385" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B35-forests-14-00945' class='html-xx' data-content='35.'>Miyoshi, G.T.; Imai, N.N.; Garcia Tommaselli, A.M.; Antunes de Moraes, M.V.; Honkavaara, E. Evaluation of Hyperspectral Multitemporal Information to Improve Tree Species Identification in the Highly Diverse Atlantic Forest. <span class='html-italic'>Remote Sens.</span> <b>2020</b>, <span class='html-italic'>12</span>, 244. [<a href="https://scholar.google.com/scholar_lookup?title=Evaluation+of+Hyperspectral+Multitemporal+Information+to+Improve+Tree+Species+Identification+in+the+Highly+Diverse+Atlantic+Forest&author=Miyoshi,+G.T.&author=Imai,+N.N.&author=Garcia+Tommaselli,+A.M.&author=Antunes+de+Moraes,+M.V.&author=Honkavaara,+E.&publication_year=2020&journal=Remote+Sens.&volume=12&pages=244&doi=10.3390/rs12020244" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs12020244" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B36-forests-14-00945' class='html-xx' data-content='36.'>Mariscal-Flores, E.J. Potencial Produtivo e Alternativas de Manejo Sustentável de Um Fragmento de Mata Atlântica Secundária, Município de Viçosa, Minas Gerais. Master’s Thesis, Universidade Federal de Viçosa, Viçosa, MG, Brazil, 1993. [<a href="https://scholar.google.com/scholar_lookup?title=Potencial+Produtivo+e+Alternativas+de+Manejo+Sustent%C3%A1vel+de+Um+Fragmento+de+Mata+Atl%C3%A2ntica+Secund%C3%A1ria,+Munic%C3%ADpio+de+Vi%C3%A7osa,+Minas+Gerais&author=Mariscal-Flores,+E.J.&publication_year=1993" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B37-forests-14-00945' class='html-xx' data-content='37.'>Souza, D.R.d.; Souza, A.L.d.; Gama, J.R.V.; Leite, H.G. Emprego de Análise Multivariada Para Estratificação Vertical de Florestas Ineqüiâneas. <span class='html-italic'>Rev. Árvore</span> <b>2003</b>, <span class='html-italic'>27</span>, 59–63. [<a href="https://scholar.google.com/scholar_lookup?title=Emprego+de+An%C3%A1lise+Multivariada+Para+Estratifica%C3%A7%C3%A3o+Vertical+de+Florestas+Ineq%C3%BCi%C3%A2neas&author=Souza,+D.R.d.&author=Souza,+A.L.d.&author=Gama,+J.R.V.&author=Leite,+H.G.&publication_year=2003&journal=Rev.+%C3%81rvore&volume=27&pages=59%E2%80%9363&doi=10.1590/S0100-67622003000100008" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1590/S0100-67622003000100008" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B38-forests-14-00945' class='html-xx' data-content='38.'>Ishii, H.T.; Tanabe, S.; Hiura, T. Exploring the Relationships among Canopy Structure, Stand Productivity, and Biodiversity of Temperate Forest Ecosystems. <span class='html-italic'>For. 
Sci.</span> <b>2004</b>, <span class='html-italic'>50</span>, 342–355. [<a href="https://scholar.google.com/scholar_lookup?title=Exploring+the+Relationships+among+Canopy+Structure,+Stand+Productivity,+and+Biodiversity+of+Temperate+Forest+Ecosystems&author=Ishii,+H.T.&author=Tanabe,+S.&author=Hiura,+T.&publication_year=2004&journal=For.+Sci.&volume=50&pages=342%E2%80%93355" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B39-forests-14-00945' class='html-xx' data-content='39.'>Lesica, P.; Allendorf, F.W. Ecological Genetics and the Restoration of Plant Communities: Mix or Match? <span class='html-italic'>Restor. Ecol.</span> <b>1999</b>, <span class='html-italic'>7</span>, 42–50. [<a href="https://scholar.google.com/scholar_lookup?title=Ecological+Genetics+and+the+Restoration+of+Plant+Communities:+Mix+or+Match?&author=Lesica,+P.&author=Allendorf,+F.W.&publication_year=1999&journal=Restor.+Ecol.&volume=7&pages=42%E2%80%9350&doi=10.1046/j.1526-100X.1999.07105.x" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1046/j.1526-100X.1999.07105.x" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B40-forests-14-00945' class='html-xx' data-content='40.'>Carvalho, P.E.R. <span class='html-italic'>Espécies Arbóreas Brasileiras</span>; Embrapa Informação Tecnológica Brasília: Brasília, Brazil, 2003; Volume 1. [<a href="https://scholar.google.com/scholar_lookup?title=Esp%C3%A9cies+Arb%C3%B3reas+Brasileiras&author=Carvalho,+P.E.R.&publication_year=2003" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B41-forests-14-00945' class='html-xx' data-content='41.'>Carvalho, P.E.R. <span class='html-italic'>Espécies Arbóreas Brasileiras</span>; Embrapa Informação Tecnológica Brasília: Brasília, Brazil, 2008; Volume 3. [<a href="https://scholar.google.com/scholar_lookup?title=Esp%C3%A9cies+Arb%C3%B3reas+Brasileiras&author=Carvalho,+P.E.R.&publication_year=2008" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B42-forests-14-00945' class='html-xx' data-content='42.'>Carvalho, P.E.R. <span class='html-italic'>Espécies Arbóreas Brasileiras</span>; Embrapa Informação Tecnológica Brasília: Brasília, Brazil, 2014; Volume 5. [<a href="https://scholar.google.com/scholar_lookup?title=Esp%C3%A9cies+Arb%C3%B3reas+Brasileiras&author=Carvalho,+P.E.R.&publication_year=2014" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B43-forests-14-00945' class='html-xx' data-content='43.'>Carvalho, P.E.R. <span class='html-italic'>Espécies Arbóreas Brasileiras</span>; Embrapa Informação Tecnológica Brasília: Brasília, Brazil, 2006; Volume 2. [<a href="https://scholar.google.com/scholar_lookup?title=Esp%C3%A9cies+Arb%C3%B3reas+Brasileiras&author=Carvalho,+P.E.R.&publication_year=2006" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B44-forests-14-00945' class='html-xx' data-content='44.'>Miyoshi, G.T.; Imai, N.N.; Tommaselli, A.M.G.; Honkavaara, E.; Näsi, R.; Moriya, É.A.S. Radiometric Block Adjustment of Hyperspectral Image Blocks in the Brazilian Environment. <span class='html-italic'>Int. J. Remote Sens.</span> <b>2018</b>, <span class='html-italic'>39</span>, 4910–4930. 
[<a href="https://scholar.google.com/scholar_lookup?title=Radiometric+Block+Adjustment+of+Hyperspectral+Image+Blocks+in+the+Brazilian+Environment&author=Miyoshi,+G.T.&author=Imai,+N.N.&author=Tommaselli,+A.M.G.&author=Honkavaara,+E.&author=N%C3%A4si,+R.&author=Moriya,+%C3%89.A.S.&publication_year=2018&journal=Int.+J.+Remote+Sens.&volume=39&pages=4910%E2%80%934930&doi=10.1080/01431161.2018.1425570" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1080/01431161.2018.1425570" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B45-forests-14-00945' class='html-xx' data-content='45.'>Oliveira, R.A.; Tommaselli, A.M.; Honkavaara, E. Geometric Calibration of a Hyperspectral Frame Camera. <span class='html-italic'>Photogramm. Rec.</span> <b>2016</b>, <span class='html-italic'>31</span>, 325–347. [<a href="https://scholar.google.com/scholar_lookup?title=Geometric+Calibration+of+a+Hyperspectral+Frame+Camera&author=Oliveira,+R.A.&author=Tommaselli,+A.M.&author=Honkavaara,+E.&publication_year=2016&journal=Photogramm.+Rec.&volume=31&pages=325%E2%80%93347&doi=10.1111/phor.12153" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1111/phor.12153" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B46-forests-14-00945' class='html-xx' data-content='46.'>Oliveira, R.A.; Tommaselli, A.M.; Honkavaara, E. Generating a Hyperspectral Digital Surface Model Using a Hyperspectral 2D Frame Camera. <span class='html-italic'>ISPRS J. Photogramm. Remote Sens.</span> <b>2019</b>, <span class='html-italic'>147</span>, 345–360. [<a href="https://scholar.google.com/scholar_lookup?title=Generating+a+Hyperspectral+Digital+Surface+Model+Using+a+Hyperspectral+2D+Frame+Camera&author=Oliveira,+R.A.&author=Tommaselli,+A.M.&author=Honkavaara,+E.&publication_year=2019&journal=ISPRS+J.+Photogramm.+Remote+Sens.&volume=147&pages=345%E2%80%93360&doi=10.1016/j.isprsjprs.2018.11.025" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.isprsjprs.2018.11.025" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B47-forests-14-00945' class='html-xx' data-content='47.'>Miyoshi, G.T. Emergent Tree Species Identification in Highly Diverse Brazilian Atlantic Forest Using Hyperspectral Images Acquired with UAV. Doctoral Thesis, Universidade Estadual Paulista, Faculdade de Ciências e Tecnologia, Presidente Prudente, SP, Brazil, 2020. [<a href="https://scholar.google.com/scholar_lookup?title=Emergent+Tree+Species+Identification+in+Highly+Diverse+Brazilian+Atlantic+Forest+Using+Hyperspectral+Images+Acquired+with+UAV&author=Miyoshi,+G.T.&publication_year=2020" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B48-forests-14-00945' class='html-xx' data-content='48.'>Honkavaara, E.; Saari, H.; Kaivosoja, J.; Pölönen, I.; Hakala, T.; Litkey, P.; Mäkynen, J.; Pesonen, L. Processing and Assessment of Spectrometric, Stereoscopic Imagery Collected Using a Lightweight UAV Spectral Camera for Precision Agriculture. <span class='html-italic'>Remote Sens.</span> <b>2013</b>, <span class='html-italic'>5</span>, 5006–5039. 
[<a href="https://scholar.google.com/scholar_lookup?title=Processing+and+Assessment+of+Spectrometric,+Stereoscopic+Imagery+Collected+Using+a+Lightweight+UAV+Spectral+Camera+for+Precision+Agriculture&author=Honkavaara,+E.&author=Saari,+H.&author=Kaivosoja,+J.&author=P%C3%B6l%C3%B6nen,+I.&author=Hakala,+T.&author=Litkey,+P.&author=M%C3%A4kynen,+J.&author=Pesonen,+L.&publication_year=2013&journal=Remote+Sens.&volume=5&pages=5006%E2%80%935039&doi=10.3390/rs5105006" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs5105006" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B49-forests-14-00945' class='html-xx' data-content='49.'>Mäkeläinen, A.; Saari, H.; Hippi, I.; Sarkeala, J.; Soukkamäki, J. 2D Hyperspectral Frame Imager Camera Data in Photogrammetric Mosaicking. <span class='html-italic'>ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.</span> <b>2013</b>, <span class='html-italic'>XL-1/W2</span>, 263–267. [<a href="https://scholar.google.com/scholar_lookup?title=2D+Hyperspectral+Frame+Imager+Camera+Data+in+Photogrammetric+Mosaicking&author=M%C3%A4kel%C3%A4inen,+A.&author=Saari,+H.&author=Hippi,+I.&author=Sarkeala,+J.&author=Soukkam%C3%A4ki,+J.&publication_year=2013&journal=ISPRS-Int.+Arch.+Photogramm.+Remote+Sens.+Spat.+Inf.+Sci.&volume=XL-1/W2&pages=263%E2%80%93267&doi=10.5194/isprsarchives-XL-1-W2-263-2013" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.5194/isprsarchives-XL-1-W2-263-2013" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B50-forests-14-00945' class='html-xx' data-content='50.'>Saari, H.; Aallos, V.-V.; Akujärvi, A.; Antila, T.; Holmlund, C.; Kantojärvi, U.; Mäkynen, J.; Ollila, J. Novel Miniaturized Hyperspectral Sensor for UAV and Space Applications. In <span class='html-italic'>Proceedings of the Sensors, Systems, and Next-Generation Satellites XIII</span>; International Society for Optics and Photonics (SPIE): Bellingham, WA, USA, 2009; Volume 7474, p. 74741M. [<a href="https://scholar.google.com/scholar_lookup?title=Novel+Miniaturized+Hyperspectral+Sensor+for+UAV+and+Space+Applications&author=Saari,+H.&author=Aallos,+V.-V.&author=Akuj%C3%A4rvi,+A.&author=Antila,+T.&author=Holmlund,+C.&author=Kantoj%C3%A4rvi,+U.&author=M%C3%A4kynen,+J.&author=Ollila,+J.&publication_year=2009&pages=74741M" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B51-forests-14-00945' class='html-xx' data-content='51.'>Näsi, R.; Honkavaara, E.; Lyytikäinen-Saarenmaa, P.; Blomqvist, M.; Litkey, P.; Hakala, T.; Viljanen, N.; Kantola, T.; Tanhuapää, T.; Holopainen, M. Using UAV-Based Photogrammetry and Hyperspectral Imaging for Mapping Bark Beetle Damage at Tree-Level. <span class='html-italic'>Remote Sens.</span> <b>2015</b>, <span class='html-italic'>7</span>, 15467–15493. 
[<a href="https://scholar.google.com/scholar_lookup?title=Using+UAV-Based+Photogrammetry+and+Hyperspectral+Imaging+for+Mapping+Bark+Beetle+Damage+at+Tree-Level&author=N%C3%A4si,+R.&author=Honkavaara,+E.&author=Lyytik%C3%A4inen-Saarenmaa,+P.&author=Blomqvist,+M.&author=Litkey,+P.&author=Hakala,+T.&author=Viljanen,+N.&author=Kantola,+T.&author=Tanhuap%C3%A4%C3%A4,+T.&author=Holopainen,+M.&publication_year=2015&journal=Remote+Sens.&volume=7&pages=15467%E2%80%9315493&doi=10.3390/rs71115467" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs71115467" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B52-forests-14-00945' class='html-xx' data-content='52.'>ASD FieldSpec® UV/VNIR. <span class='html-italic'>HandHeld Spectroradiometer—User’s Guide</span>; Analytical Spectral Devices, Inc.: Boulder, CO, USA, 2002. [<a href="https://scholar.google.com/scholar_lookup?title=HandHeld+Spectroradiometer%E2%80%94User%E2%80%99s+Guide&author=ASD+FieldSpec%C2%AE+UV/VNIR&publication_year=2002" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B53-forests-14-00945' class='html-xx' data-content='53.'>Moriya, E.A.S.; Imai, N.N.; Tommaselli, A.M.G.; Miyoshi, G.T. Mapping Mosaic Virus in Sugarcane Based on Hyperspectral Images. <span class='html-italic'>IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens.</span> <b>2017</b>, <span class='html-italic'>10</span>, 740–748. [<a href="https://scholar.google.com/scholar_lookup?title=Mapping+Mosaic+Virus+in+Sugarcane+Based+on+Hyperspectral+Images&author=Moriya,+E.A.S.&author=Imai,+N.N.&author=Tommaselli,+A.M.G.&author=Miyoshi,+G.T.&publication_year=2017&journal=IEEE+J.+Sel.+Top.+Appl.+Earth+Obs.+Remote+Sens.&volume=10&pages=740%E2%80%93748&doi=10.1109/JSTARS.2016.2635482" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1109/JSTARS.2016.2635482" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B54-forests-14-00945' class='html-xx' data-content='54.'>Honkavaara, E.; Kaivosoja, J.; Mäkynen, J.; Pellikka, I.; Pesonen, L.; Saari, H.; Salo, H.; Hakala, T.; Marklelin, L.; Rosnell, T. Hyperspectral Reflectance Signatures and Point Clouds for Precision Agriculture by Light Weight UAV Imaging System. <span class='html-italic'>ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci.</span> <b>2012</b>, <span class='html-italic'>7</span>, 353–358. [<a href="https://scholar.google.com/scholar_lookup?title=Hyperspectral+Reflectance+Signatures+and+Point+Clouds+for+Precision+Agriculture+by+Light+Weight+UAV+Imaging+System&author=Honkavaara,+E.&author=Kaivosoja,+J.&author=M%C3%A4kynen,+J.&author=Pellikka,+I.&author=Pesonen,+L.&author=Saari,+H.&author=Salo,+H.&author=Hakala,+T.&author=Marklelin,+L.&author=Rosnell,+T.&publication_year=2012&journal=ISPRS+Ann.+Photogramm.+Remote+Sens.+Spat.+Inf.+Sci.&volume=7&pages=353%E2%80%93358&doi=10.5194/isprsannals-I-7-353-2012" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.5194/isprsannals-I-7-353-2012" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B55-forests-14-00945' class='html-xx' data-content='55.'>Honkavaara, E.; Hakala, T.; Saari, H.; Markelin, L.; Mäkynen, J.; Rosnell, T. A Process for Radiometric Correction of UAV Image Blocks. <span class='html-italic'>Photogramm. Fernerkund. 
Geoinf.</span> <b>2012</b>, 115–127. [<a href="https://scholar.google.com/scholar_lookup?title=A+Process+for+Radiometric+Correction+of+UAV+Image+Blocks&author=Honkavaara,+E.&author=Hakala,+T.&author=Saari,+H.&author=Markelin,+L.&author=M%C3%A4kynen,+J.&author=Rosnell,+T.&publication_year=2012&journal=Photogramm.+Fernerkund.+Geoinf.&pages=115%E2%80%93127&doi=10.1127/1432-8364/2012/0106" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1127/1432-8364/2012/0106" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B56-forests-14-00945' class='html-xx' data-content='56.'>Miyoshi, G.T. Caracterização Espectral de Espécies de Mata Atlântica de Interior Em Nível Foliar e de Copa. Master’s Thesis, Universidade Estadual Paulista, Faculdade de Ciências e Tecnologia, Presidente Prudente, SP, Brazil, 2016. [<a href="https://scholar.google.com/scholar_lookup?title=Caracteriza%C3%A7%C3%A3o+Espectral+de+Esp%C3%A9cies+de+Mata+Atl%C3%A2ntica+de+Interior+Em+N%C3%ADvel+Foliar+e+de+Copa&author=Miyoshi,+G.T.&publication_year=2016" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B57-forests-14-00945' class='html-xx' data-content='57.'>Honkavaara, E.; Rosnell, T.; Oliveira, R.; Tommaselli, A. Band Registration of Tuneable Frame Format Hyperspectral UAV Imagers in Complex Scenes. <span class='html-italic'>ISPRS J. Photogramm. Remote Sens.</span> <b>2017</b>, <span class='html-italic'>134</span>, 96–109. [<a href="https://scholar.google.com/scholar_lookup?title=Band+Registration+of+Tuneable+Frame+Format+Hyperspectral+UAV+Imagers+in+Complex+Scenes&author=Honkavaara,+E.&author=Rosnell,+T.&author=Oliveira,+R.&author=Tommaselli,+A.&publication_year=2017&journal=ISPRS+J.+Photogramm.+Remote+Sens.&volume=134&pages=96%E2%80%93109&doi=10.1016/j.isprsjprs.2017.10.014" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.isprsjprs.2017.10.014" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B58-forests-14-00945' class='html-xx' data-content='58.'>Baugh, W.M.; Groeneveld, D.P. Empirical Proof of the Empirical Line. <span class='html-italic'>Int. J. Remote Sens.</span> <b>2008</b>, <span class='html-italic'>29</span>, 665–672. [<a href="https://scholar.google.com/scholar_lookup?title=Empirical+Proof+of+the+Empirical+Line&author=Baugh,+W.M.&author=Groeneveld,+D.P.&publication_year=2008&journal=Int.+J.+Remote+Sens.&volume=29&pages=665%E2%80%93672&doi=10.1080/01431160701352162" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1080/01431160701352162" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B59-forests-14-00945' class='html-xx' data-content='59.'>Richards, J.A.; Jia, X. <span class='html-italic'>Remote Sensing Digital Image Analysis</span>, 4th ed.; Springer: Berlin, Germany, 2006. [<a href="https://scholar.google.com/scholar_lookup?title=Remote+Sensing+Digital+Image+Analysis&author=Richards,+J.A.&author=Jia,+X.&publication_year=2006" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B60-forests-14-00945' class='html-xx' data-content='60.'>Jensen, J.R. <span class='html-italic'>Remote Sensing of the Environment: An Earth Resource Perspective 2/e</span>, 2nd ed.; Prentice Hall: Upper Saddle River, NJ, USA, 2006. 
[<a href="https://scholar.google.com/scholar_lookup?title=Remote+Sensing+of+the+Environment:+An+Earth+Resource+Perspective+2/e&author=Jensen,+J.R.&publication_year=2006" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B61-forests-14-00945' class='html-xx' data-content='61.'>Ponzoni, F.J.; Shimabukuro, Y.E.; Kuplich, T.M. <span class='html-italic'>Sensoriamento Remoto Da Vegetação (Remote Sensing of Vegetation)</span>, 2nd ed.; Oficina de Textos: São Paulo, Brazil, 2012. [<a href="https://scholar.google.com/scholar_lookup?title=Sensoriamento+Remoto+Da+Vegeta%C3%A7%C3%A3o+(Remote+Sensing+of+Vegetation)&author=Ponzoni,+F.J.&author=Shimabukuro,+Y.E.&author=Kuplich,+T.M.&publication_year=2012" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B62-forests-14-00945' class='html-xx' data-content='62.'>Martins-Neto, R.P.; Tommaselli, A.M.G.; Imai, N.N.; David, H.C.; Miltiadou, M.; Honkavaara, E. Identification of Significative LiDAR Metrics and Comparison of Machine Learning Approaches for Estimating Stand and Diversity Variables in Heterogeneous Brazilian Atlantic Forest. <span class='html-italic'>Remote Sens.</span> <b>2021</b>, <span class='html-italic'>13</span>, 2444. [<a href="https://scholar.google.com/scholar_lookup?title=Identification+of+Significative+LiDAR+Metrics+and+Comparison+of+Machine+Learning+Approaches+for+Estimating+Stand+and+Diversity+Variables+in+Heterogeneous+Brazilian+Atlantic+Forest&author=Martins-Neto,+R.P.&author=Tommaselli,+A.M.G.&author=Imai,+N.N.&author=David,+H.C.&author=Miltiadou,+M.&author=Honkavaara,+E.&publication_year=2021&journal=Remote+Sens.&volume=13&pages=2444&doi=10.3390/rs13132444" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs13132444" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B63-forests-14-00945' class='html-xx' data-content='63.'>Isenburg, M. LAStools-Efficient LiDAR Processing Software. Available online: <a href='http://lastools.org/' target='_blank' rel="noopener noreferrer" >http://lastools.org/</a> (accessed on 12 November 2020).</li><li id='B64-forests-14-00945' class='html-xx' data-content='64.'><span class='html-italic'>R Core Team R: A Language and Environment for Statistical Computing</span>; R Core Team: Vienna, Austria, 2017.</li><li id='B65-forests-14-00945' class='html-xx' data-content='65.'>Roussel, J.-R.; Auty, D.; De Boissieu, F.; Meador, A.S. LidR: Airborne LiDAR Data Manipulation and Visualization for Forestry Applications. Available online: <a href='https://rdrr.io/cran/lidR/' target='_blank' rel="noopener noreferrer" >https://rdrr.io/cran/lidR/</a> (accessed on 21 January 2021).</li><li id='B66-forests-14-00945' class='html-xx' data-content='66.'>Roussel, J.-R.; Bourdon, J.-F.; Achim, A. Range-Based Intensity Normalization of ALS Data over Forested Areas Using a Sensor Tracking Method from Multiple Returns. <span class='html-italic'>Non-Peer Rev. EarthArXiv Prepr.</span> <b>2020</b>. 
[<a href="https://scholar.google.com/scholar_lookup?title=Range-Based+Intensity+Normalization+of+ALS+Data+over+Forested+Areas+Using+a+Sensor+Tracking+Method+from+Multiple+Returns&author=Roussel,+J.-R.&author=Bourdon,+J.-F.&author=Achim,+A.&publication_year=2020&journal=Non-Peer+Rev.+EarthArXiv+Prepr.&doi=10.31223/osf.io/k32qw" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.31223/osf.io/k32qw" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B67-forests-14-00945' class='html-xx' data-content='67.'>Gatziolis, D. Dynamic Range-Based Intensity Normalization for Airborne, Discrete Return Lidar Data of Forest Canopies. <span class='html-italic'>Photogramm. Eng. Remote Sens.</span> <b>2011</b>, <span class='html-italic'>77</span>, 251–259. [<a href="https://scholar.google.com/scholar_lookup?title=Dynamic+Range-Based+Intensity+Normalization+for+Airborne,+Discrete+Return+Lidar+Data+of+Forest+Canopies&author=Gatziolis,+D.&publication_year=2011&journal=Photogramm.+Eng.+Remote+Sens.&volume=77&pages=251%E2%80%93259&doi=10.14358/PERS.77.3.251" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.14358/PERS.77.3.251" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B68-forests-14-00945' class='html-xx' data-content='68.'>Kashani, A.G.; Olsen, M.J.; Parrish, C.E.; Wilson, N. A Review of LiDAR Radiometric Processing: From Ad Hoc Intensity Correction to Rigorous Radiometric Calibration. <span class='html-italic'>Sensors</span> <b>2015</b>, <span class='html-italic'>15</span>, 28099–28128. [<a href="https://scholar.google.com/scholar_lookup?title=A+Review+of+LiDAR+Radiometric+Processing:+From+Ad+Hoc+Intensity+Correction+to+Rigorous+Radiometric+Calibration&author=Kashani,+A.G.&author=Olsen,+M.J.&author=Parrish,+C.E.&author=Wilson,+N.&publication_year=2015&journal=Sensors&volume=15&pages=28099%E2%80%9328128&doi=10.3390/s151128099" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/s151128099" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B69-forests-14-00945' class='html-xx' data-content='69.'>Khosravipour, A.; Skidmore, A.K.; Isenburg, M.; Wang, T.; Hussin, Y.A. Generating Pit-Free Canopy Height Models from Airborne Lidar. <span class='html-italic'>Photogramm. Eng. Remote Sens.</span> <b>2014</b>, <span class='html-italic'>80</span>, 863–872. [<a href="https://scholar.google.com/scholar_lookup?title=Generating+Pit-Free+Canopy+Height+Models+from+Airborne+Lidar&author=Khosravipour,+A.&author=Skidmore,+A.K.&author=Isenburg,+M.&author=Wang,+T.&author=Hussin,+Y.A.&publication_year=2014&journal=Photogramm.+Eng.+Remote+Sens.&volume=80&pages=863%E2%80%93872&doi=10.14358/PERS.80.9.863" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.14358/PERS.80.9.863" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B70-forests-14-00945' class='html-xx' data-content='70.'>Miltiadou, M.; Grant, M.; Brown, M.; Warren, M.; Carolan, E. Reconstruction of a 3D Polygon Representation from Full-Waveform LiDAR Data. In Proceedings of the RSPSoc Annual Conference, Aberystwyth, UK, 2 September 2014. 
[<a href="https://scholar.google.com/scholar_lookup?title=Reconstruction+of+a+3D+Polygon+Representation+from+Full-Waveform+LiDAR+Data&conference=Proceedings+of+the+RSPSoc+Annual+Conference&author=Miltiadou,+M.&author=Grant,+M.&author=Brown,+M.&author=Warren,+M.&author=Carolan,+E.&publication_year=2014" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B71-forests-14-00945' class='html-xx' data-content='71.'>Miltiadou, M.; Warren, M.; Grant, M.G.; Brown, M.A. Alignment of Hyperspectral Imagery and Full-Waveform LiDAR Data for Visualisation and Classification Purposes. <span class='html-italic'>Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.-ISPRS Arch.</span> <b>2015</b>, <span class='html-italic'>XL-7/W3</span>, 1257–1264. [<a href="https://scholar.google.com/scholar_lookup?title=Alignment+of+Hyperspectral+Imagery+and+Full-Waveform+LiDAR+Data+for+Visualisation+and+Classification+Purposes&author=Miltiadou,+M.&author=Warren,+M.&author=Grant,+M.G.&author=Brown,+M.A.&publication_year=2015&journal=Int.+Arch.+Photogramm.+Remote+Sens.+Spat.+Inf.+Sci.-ISPRS+Arch.&volume=XL-7/W3&pages=1257%E2%80%931264&doi=10.5194/isprsarchives-XL-7-W3-1257-2015" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.5194/isprsarchives-XL-7-W3-1257-2015" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B72-forests-14-00945' class='html-xx' data-content='72.'>Miltiadou, M.; Grant, M.G.; Campbell, N.D.; Warren, M.; Clewley, D.; Hadjimitsis, D.G. Open Source Software DASOS: Efficient Accumulation, Analysis, and Visualisation of Full-Waveform Lidar. In Proceedings of the Seventh International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2019), International Society for Optics and Photonics, Paphos, Cyprus, 21 March 2019; p. 111741M. [<a href="https://scholar.google.com/scholar_lookup?title=Open+Source+Software+DASOS:+Efficient+Accumulation,+Analysis,+and+Visualisation+of+Full-Waveform+Lidar&conference=Proceedings+of+the+Seventh+International+Conference+on+Remote+Sensing+and+Geoinformation+of+the+Environment+(RSCy2019),+International+Society+for+Optics+and+Photonics&author=Miltiadou,+M.&author=Grant,+M.G.&author=Campbell,+N.D.&author=Warren,+M.&author=Clewley,+D.&author=Hadjimitsis,+D.G.&publication_year=2019&pages=111741M" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B73-forests-14-00945' class='html-xx' data-content='73.'>Clark, M.L.; Roberts, D.A.; Clark, D.B. Hyperspectral Discrimination of Tropical Rain Forest Tree Species at Leaf to Crown Scales. <span class='html-italic'>Remote Sens. Environ.</span> <b>2005</b>, <span class='html-italic'>96</span>, 375–398. [<a href="https://scholar.google.com/scholar_lookup?title=Hyperspectral+Discrimination+of+Tropical+Rain+Forest+Tree+Species+at+Leaf+to+Crown+Scales&author=Clark,+M.L.&author=Roberts,+D.A.&author=Clark,+D.B.&publication_year=2005&journal=Remote+Sens.+Environ.&volume=96&pages=375%E2%80%93398&doi=10.1016/j.rse.2005.03.009" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2005.03.009" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B74-forests-14-00945' class='html-xx' data-content='74.'>Dalponte, M.; Ørka, H.O.; Ene, L.T.; Gobakken, T.; Næsset, E. 
Tree Crown Delineation and Tree Species Classification in Boreal Forests Using Hyperspectral and ALS Data. <span class='html-italic'>Remote Sens. Environ.</span> <b>2014</b>, <span class='html-italic'>140</span>, 306–317. [<a href="https://scholar.google.com/scholar_lookup?title=Tree+Crown+Delineation+and+Tree+Species+Classification+in+Boreal+Forests+Using+Hyperspectral+and+ALS+Data&author=Dalponte,+M.&author=%C3%98rka,+H.O.&author=Ene,+L.T.&author=Gobakken,+T.&author=N%C3%A6sset,+E.&publication_year=2014&journal=Remote+Sens.+Environ.&volume=140&pages=306%E2%80%93317&doi=10.1016/j.rse.2013.09.006" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2013.09.006" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B75-forests-14-00945' class='html-xx' data-content='75.'>Tochon, G.; Feret, J.-B.; Valero, S.; Martin, R.E.; Knapp, D.E.; Salembier, P.; Chanussot, J.; Asner, G.P. On the Use of Binary Partition Trees for the Tree Crown Segmentation of Tropical Rainforest Hyperspectral Images. <span class='html-italic'>Remote Sens. Environ.</span> <b>2015</b>, <span class='html-italic'>159</span>, 318–331. [<a href="https://scholar.google.com/scholar_lookup?title=On+the+Use+of+Binary+Partition+Trees+for+the+Tree+Crown+Segmentation+of+Tropical+Rainforest+Hyperspectral+Images&author=Tochon,+G.&author=Feret,+J.-B.&author=Valero,+S.&author=Martin,+R.E.&author=Knapp,+D.E.&author=Salembier,+P.&author=Chanussot,+J.&author=Asner,+G.P.&publication_year=2015&journal=Remote+Sens.+Environ.&volume=159&pages=318%E2%80%93331&doi=10.1016/j.rse.2014.12.020" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2014.12.020" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B76-forests-14-00945' class='html-xx' data-content='76.'>Achanta, R.; Shaji, A.; Smith, K.; Lucchi, A.; Fua, P.; Süsstrunk, S. SLIC Superpixels Compared to State-of-the-Art Superpixel Methods. <span class='html-italic'>IEEE Trans. Pattern Anal. Mach. Intell.</span> <b>2012</b>, <span class='html-italic'>34</span>, 2274–2282. [<a href="https://scholar.google.com/scholar_lookup?title=SLIC+Superpixels+Compared+to+State-of-the-Art+Superpixel+Methods&author=Achanta,+R.&author=Shaji,+A.&author=Smith,+K.&author=Lucchi,+A.&author=Fua,+P.&author=S%C3%BCsstrunk,+S.&publication_year=2012&journal=IEEE+Trans.+Pattern+Anal.+Mach.+Intell.&volume=34&pages=2274%E2%80%932282&doi=10.1109/TPAMI.2012.120" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1109/TPAMI.2012.120" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B77-forests-14-00945' class='html-xx' data-content='77.'>Nowosad, J.; Stepinski, T.F. Extended SLIC Superpixels Algorithm for Applications to Non-Imagery Geospatial Rasters. <span class='html-italic'>Int. J. Appl. Earth Obs. Geoinf.</span> <b>2022</b>, <span class='html-italic'>112</span>, 102935. 
[<a href="https://scholar.google.com/scholar_lookup?title=Extended+SLIC+Superpixels+Algorithm+for+Applications+to+Non-Imagery+Geospatial+Rasters&author=Nowosad,+J.&author=Stepinski,+T.F.&publication_year=2022&journal=Int.+J.+Appl.+Earth+Obs.+Geoinf.&volume=112&pages=102935&doi=10.1016/j.jag.2022.102935" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.jag.2022.102935" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B78-forests-14-00945' class='html-xx' data-content='78.'>Bruzzone, L.; Roli, F.; Serpico, S.B. An Extension of the Jeffreys-Matusita Distance to Multiclass Cases for Feature Selection. <span class='html-italic'>IEEE Trans. Geosci. Remote Sens.</span> <b>1995</b>, <span class='html-italic'>33</span>, 1318–1321. [<a href="https://scholar.google.com/scholar_lookup?title=An+Extension+of+the+Jeffreys-Matusita+Distance+to+Multiclass+Cases+for+Feature+Selection&author=Bruzzone,+L.&author=Roli,+F.&author=Serpico,+S.B.&publication_year=1995&journal=IEEE+Trans.+Geosci.+Remote+Sens.&volume=33&pages=1318%E2%80%931321&doi=10.1109/36.477187" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1109/36.477187" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B79-forests-14-00945' class='html-xx' data-content='79.'>Sothe, C.; Dalponte, M.; Almeida, C.M.d.; Schimalski, M.B.; Lima, C.L.; Liesenberg, V.; Miyoshi, G.T.; Tommaselli, A.M.G. Tree Species Classification in a Highly Diverse Subtropical Forest Integrating UAV-Based Photogrammetric Point Cloud and Hyperspectral Data. <span class='html-italic'>Remote Sens.</span> <b>2019</b>, <span class='html-italic'>11</span>, 1338. [<a href="https://scholar.google.com/scholar_lookup?title=Tree+Species+Classification+in+a+Highly+Diverse+Subtropical+Forest+Integrating+UAV-Based+Photogrammetric+Point+Cloud+and+Hyperspectral+Data&author=Sothe,+C.&author=Dalponte,+M.&author=Almeida,+C.M.d.&author=Schimalski,+M.B.&author=Lima,+C.L.&author=Liesenberg,+V.&author=Miyoshi,+G.T.&author=Tommaselli,+A.M.G.&publication_year=2019&journal=Remote+Sens.&volume=11&pages=1338&doi=10.3390/rs11111338" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs11111338" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B80-forests-14-00945' class='html-xx' data-content='80.'>Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring Vegetation Systems in the Great Plains with ERTS. <span class='html-italic'>NASA Spec.</span> <b>1974</b>, <span class='html-italic'>351</span>, 309. [<a href="https://scholar.google.com/scholar_lookup?title=Monitoring+Vegetation+Systems+in+the+Great+Plains+with+ERTS&author=Rouse,+J.W.&author=Haas,+R.H.&author=Schell,+J.A.&author=Deering,+D.W.&publication_year=1974&journal=NASA+Spec.&volume=351&pages=309" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B81-forests-14-00945' class='html-xx' data-content='81.'>Gandia, S.; Fernández, G.; García, J.C.; Moreno, J. Retrieval of Vegetation Biophysical Variables from CHRIS/PROBA Data in the SPARC Campaign. <span class='html-italic'>Esa. Sp.</span> <b>2004</b>, <span class='html-italic'>578</span>, 40–48. 
[<a href="https://scholar.google.com/scholar_lookup?title=Retrieval+of+Vegetation+Biophysical+Variables+from+CHRIS/PROBA+Data+in+the+SPARC+Campaign&author=Gandia,+S.&author=Fern%C3%A1ndez,+G.&author=Garc%C3%ADa,+J.C.&author=Moreno,+J.&publication_year=2004&journal=Esa.+Sp.&volume=578&pages=40%E2%80%9348" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B82-forests-14-00945' class='html-xx' data-content='82.'>Main, R.; Cho, M.A.; Mathieu, R.; O’Kennedy, M.M.; Ramoelo, A.; Koch, S. An Investigation into Robust Spectral Indices for Leaf Chlorophyll Estimation. <span class='html-italic'>ISPRS J. Photogramm. Remote Sens.</span> <b>2011</b>, <span class='html-italic'>66</span>, 751–761. [<a href="https://scholar.google.com/scholar_lookup?title=An+Investigation+into+Robust+Spectral+Indices+for+Leaf+Chlorophyll+Estimation&author=Main,+R.&author=Cho,+M.A.&author=Mathieu,+R.&author=O%E2%80%99Kennedy,+M.M.&author=Ramoelo,+A.&author=Koch,+S.&publication_year=2011&journal=ISPRS+J.+Photogramm.+Remote+Sens.&volume=66&pages=751%E2%80%93761&doi=10.1016/j.isprsjprs.2011.08.001" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.isprsjprs.2011.08.001" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B83-forests-14-00945' class='html-xx' data-content='83.'>Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a Green Channel in Remote Sensing of Global Vegetation from EOS-MODIS. <span class='html-italic'>Remote Sens. Environ.</span> <b>1996</b>, <span class='html-italic'>58</span>, 289–298. [<a href="https://scholar.google.com/scholar_lookup?title=Use+of+a+Green+Channel+in+Remote+Sensing+of+Global+Vegetation+from+EOS-MODIS&author=Gitelson,+A.A.&author=Kaufman,+Y.J.&author=Merzlyak,+M.N.&publication_year=1996&journal=Remote+Sens.+Environ.&volume=58&pages=289%E2%80%93298&doi=10.1016/S0034-4257(96)00072-7" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/S0034-4257(96)00072-7" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B84-forests-14-00945' class='html-xx' data-content='84.'>Le Maire, G.; Francois, C.; Dufrene, E. Towards Universal Broad Leaf Chlorophyll Indices Using PROSPECT Simulated Database and Hyperspectral Reflectance Measurements. <span class='html-italic'>Remote Sens. Environ.</span> <b>2004</b>, <span class='html-italic'>89</span>, 1–28. [<a href="https://scholar.google.com/scholar_lookup?title=Towards+Universal+Broad+Leaf+Chlorophyll+Indices+Using+PROSPECT+Simulated+Database+and+Hyperspectral+Reflectance+Measurements&author=Le+Maire,+G.&author=Francois,+C.&author=Dufrene,+E.&publication_year=2004&journal=Remote+Sens.+Environ.&volume=89&pages=1%E2%80%9328&doi=10.1016/j.rse.2003.09.004" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2003.09.004" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B85-forests-14-00945' class='html-xx' data-content='85.'>Daughtry, C.S.; Walthall, C.L.; Kim, M.S.; De Colstoun, E.B.; McMurtrey Iii, J.E. Estimating Corn Leaf Chlorophyll Concentration from Leaf and Canopy Reflectance. <span class='html-italic'>Remote Sens. Environ.</span> <b>2000</b>, <span class='html-italic'>74</span>, 229–239. 
[<a href="https://scholar.google.com/scholar_lookup?title=Estimating+Corn+Leaf+Chlorophyll+Concentration+from+Leaf+and+Canopy+Reflectance&author=Daughtry,+C.S.&author=Walthall,+C.L.&author=Kim,+M.S.&author=De+Colstoun,+E.B.&author=McMurtrey+Iii,+J.E.&publication_year=2000&journal=Remote+Sens.+Environ.&volume=74&pages=229%E2%80%93239&doi=10.1016/S0034-4257(00)00113-9" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/S0034-4257(00)00113-9" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B86-forests-14-00945' class='html-xx' data-content='86.'>Gamon, J.; Serrano, L.; Surfus, J.S. The Photochemical Reflectance Index: An Optical Indicator of Photosynthetic Radiation Use Efficiency across Species, Functional Types, and Nutrient Levels. <span class='html-italic'>Oecologia</span> <b>1997</b>, <span class='html-italic'>112</span>, 492–501. [<a href="https://scholar.google.com/scholar_lookup?title=The+Photochemical+Reflectance+Index:+An+Optical+Indicator+of+Photosynthetic+Radiation+Use+Efficiency+across+Species,+Functional+Types,+and+Nutrient+Levels&author=Gamon,+J.&author=Serrano,+L.&author=Surfus,+J.S.&publication_year=1997&journal=Oecologia&volume=112&pages=492%E2%80%93501&doi=10.1007/s004420050337" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1007/s004420050337" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B87-forests-14-00945' class='html-xx' data-content='87.'>Merzlyak, M.N.; Gitelson, A.A.; Chivkunova, O.B.; Rakitin, V.Y. Non-Destructive Optical Detection of Pigment Changes during Leaf Senescence and Fruit Ripening. <span class='html-italic'>Physiol. Plant.</span> <b>1999</b>, <span class='html-italic'>106</span>, 135–141. [<a href="https://scholar.google.com/scholar_lookup?title=Non-Destructive+Optical+Detection+of+Pigment+Changes+during+Leaf+Senescence+and+Fruit+Ripening&author=Merzlyak,+M.N.&author=Gitelson,+A.A.&author=Chivkunova,+O.B.&author=Rakitin,+V.Y.&publication_year=1999&journal=Physiol.+Plant.&volume=106&pages=135%E2%80%93141&doi=10.1034/j.1399-3054.1999.106119.x" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1034/j.1399-3054.1999.106119.x" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B88-forests-14-00945' class='html-xx' data-content='88.'>Blackburn, G.A. Spectral Indices for Estimating Photosynthetic Pigment Concentrations: A Test Using Senescent Tree Leaves. <span class='html-italic'>Int. J. Remote Sens.</span> <b>1998</b>, <span class='html-italic'>19</span>, 657–675. [<a href="https://scholar.google.com/scholar_lookup?title=Spectral+Indices+for+Estimating+Photosynthetic+Pigment+Concentrations:+A+Test+Using+Senescent+Tree+Leaves&author=Blackburn,+G.A.&publication_year=1998&journal=Int.+J.+Remote+Sens.&volume=19&pages=657%E2%80%93675&doi=10.1080/014311698215919" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1080/014311698215919" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B89-forests-14-00945' class='html-xx' data-content='89.'>Clevers, J.G. Imaging Spectrometry in Agriculture-Plant Vitality and Yield Indicators. 
In <span class='html-italic'>Imaging Spectrometry—A Tool for Environmental Observations</span>; Remote Sensing; Springer: Eurocourses, Dordrecht, 1994; Volume 4, pp. 193–219. [<a href="https://scholar.google.com/scholar_lookup?title=Imaging+Spectrometry+in+Agriculture-Plant+Vitality+and+Yield+Indicators&author=Clevers,+J.G.&publication_year=1994&pages=193%E2%80%93219" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B90-forests-14-00945' class='html-xx' data-content='90.'>Baranoski, G.V.G.; Rokne, J.G. A Practical Approach for Estimating the Red Edge Position of Plant Leaf Reflectance. <span class='html-italic'>Int. J. Remote Sens.</span> <b>2005</b>, <span class='html-italic'>26</span>, 503–521. [<a href="https://scholar.google.com/scholar_lookup?title=A+Practical+Approach+for+Estimating+the+Red+Edge+Position+of+Plant+Leaf+Reflectance&author=Baranoski,+G.V.G.&author=Rokne,+J.G.&publication_year=2005&journal=Int.+J.+Remote+Sens.&volume=26&pages=503%E2%80%93521&doi=10.1080/01431160512331314029" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1080/01431160512331314029" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B91-forests-14-00945' class='html-xx' data-content='91.'>Dawson, T.P.; Curran, P.J. Technical Note A New Technique for Interpolating the Reflectance Red Edge Position. <span class='html-italic'>Int. J. Remote Sens.</span> <b>1998</b>, <span class='html-italic'>19</span>, 2133–2139. [<a href="https://scholar.google.com/scholar_lookup?title=Technical+Note+A+New+Technique+for+Interpolating+the+Reflectance+Red+Edge+Position&author=Dawson,+T.P.&author=Curran,+P.J.&publication_year=1998&journal=Int.+J.+Remote+Sens.&volume=19&pages=2133%E2%80%932139&doi=10.1080/014311698214910" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1080/014311698214910" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B92-forests-14-00945' class='html-xx' data-content='92.'>Peñuelas, J.; Baret, F.; Filella, I. Semi-Empirical Indices to Assess Carotenoids/Chlorophyll a Ratio from Leaf Spectral Reflectance. <span class='html-italic'>Photosynthetica</span> <b>1995</b>, <span class='html-italic'>31</span>, 221–230. [<a href="https://scholar.google.com/scholar_lookup?title=Semi-Empirical+Indices+to+Assess+Carotenoids/Chlorophyll+a+Ratio+from+Leaf+Spectral+Reflectance&author=Pe%C3%B1uelas,+J.&author=Baret,+F.&author=Filella,+I.&publication_year=1995&journal=Photosynthetica&volume=31&pages=221%E2%80%93230" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B93-forests-14-00945' class='html-xx' data-content='93.'>Lê, S.; Josse, J.; Husson, F. FactoMineR: An R Package for Multivariate Analysis. <span class='html-italic'>J. Stat. Softw.</span> <b>2008</b>, <span class='html-italic'>25</span>, 1–18. 
[<a href="https://scholar.google.com/scholar_lookup?title=FactoMineR:+An+R+Package+for+Multivariate+Analysis&author=L%C3%AA,+S.&author=Josse,+J.&author=Husson,+F.&publication_year=2008&journal=J.+Stat.+Softw.&volume=25&pages=1%E2%80%9318&doi=10.18637/jss.v025.i01" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.18637/jss.v025.i01" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B94-forests-14-00945' class='html-xx' data-content='94.'>Abdi, H.; Williams, L.J. Principal Component Analysis. <span class='html-italic'>Wiley Interdiscip. Rev. Comput. Stat.</span> <b>2010</b>, <span class='html-italic'>2</span>, 433–459. [<a href="https://scholar.google.com/scholar_lookup?title=Principal+Component+Analysis&author=Abdi,+H.&author=Williams,+L.J.&publication_year=2010&journal=Wiley+Interdiscip.+Rev.+Comput.+Stat.&volume=2&pages=433%E2%80%93459&doi=10.1002/wics.101" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1002/wics.101" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B95-forests-14-00945' class='html-xx' data-content='95.'>Kaiser, H.F. The Varimax Criterion for Analytic Rotation in Factor Analysis. <span class='html-italic'>Psychometrika</span> <b>1958</b>, <span class='html-italic'>23</span>, 187–200. [<a href="https://scholar.google.com/scholar_lookup?title=The+Varimax+Criterion+for+Analytic+Rotation+in+Factor+Analysis&author=Kaiser,+H.F.&publication_year=1958&journal=Psychometrika&volume=23&pages=187%E2%80%93200&doi=10.1007/BF02289233" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1007/BF02289233" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B96-forests-14-00945' class='html-xx' data-content='96.'>Breiman, L. Random Forests, Machine Learning 45. <span class='html-italic'>J. Clin. Microbiol</span> <b>2001</b>, <span class='html-italic'>2</span>, 199–228. [<a href="https://scholar.google.com/scholar_lookup?title=Random+Forests,+Machine+Learning+45&author=Breiman,+L.&publication_year=2001&journal=J.+Clin.+Microbiol&volume=2&pages=199%E2%80%93228" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B97-forests-14-00945' class='html-xx' data-content='97.'>Belgiu, M.; Drăguţ, L. Random Forest in Remote Sensing: A Review of Applications and Future Directions. <span class='html-italic'>ISPRS J. Photogramm. Remote Sens.</span> <b>2016</b>, <span class='html-italic'>114</span>, 24–31. [<a href="https://scholar.google.com/scholar_lookup?title=Random+Forest+in+Remote+Sensing:+A+Review+of+Applications+and+Future+Directions&author=Belgiu,+M.&author=Dr%C4%83gu%C5%A3,+L.&publication_year=2016&journal=ISPRS+J.+Photogramm.+Remote+Sens.&volume=114&pages=24%E2%80%9331&doi=10.1016/j.isprsjprs.2016.01.011" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.isprsjprs.2016.01.011" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B98-forests-14-00945' class='html-xx' data-content='98.'>Liaw, A.; Wiener, M. Classification and Regression by RandomForest. <span class='html-italic'>R News</span> <b>2002</b>, <span class='html-italic'>2</span>, 18–22. 
[<a href="https://scholar.google.com/scholar_lookup?title=Classification+and+Regression+by+RandomForest&author=Liaw,+A.&author=Wiener,+M.&publication_year=2002&journal=R+News&volume=2&pages=18%E2%80%9322" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B99-forests-14-00945' class='html-xx' data-content='99.'>Pal, M. Random Forest Classifier for Remote Sensing Classification. <span class='html-italic'>Int. J. Remote Sens.</span> <b>2005</b>, <span class='html-italic'>26</span>, 217–222. [<a href="https://scholar.google.com/scholar_lookup?title=Random+Forest+Classifier+for+Remote+Sensing+Classification&author=Pal,+M.&publication_year=2005&journal=Int.+J.+Remote+Sens.&volume=26&pages=217%E2%80%93222&doi=10.1080/01431160412331269698" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1080/01431160412331269698" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B100-forests-14-00945' class='html-xxx' data-content='100.'>Prasad, A.M.; Iverson, L.R.; Liaw, A. Newer Classification and Regression Tree Techniques: Bagging and Random Forests for Ecological Prediction. <span class='html-italic'>Ecosystems</span> <b>2006</b>, <span class='html-italic'>9</span>, 181–199. [<a href="https://scholar.google.com/scholar_lookup?title=Newer+Classification+and+Regression+Tree+Techniques:+Bagging+and+Random+Forests+for+Ecological+Prediction&author=Prasad,+A.M.&author=Iverson,+L.R.&author=Liaw,+A.&publication_year=2006&journal=Ecosystems&volume=9&pages=181%E2%80%93199&doi=10.1007/s10021-005-0054-1" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1007/s10021-005-0054-1" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B101-forests-14-00945' class='html-xxx' data-content='101.'>Nevalainen, O.; Honkavaara, E.; Tuominen, S.; Viljanen, N.; Hakala, T.; Yu, X.; Hyyppä, J.; Saari, H.; Pölönen, I.; Imai, N.N. Individual Tree Detection and Classification with UAV-Based Photogrammetric Point Clouds and Hyperspectral Imaging. <span class='html-italic'>Remote Sens.</span> <b>2017</b>, <span class='html-italic'>9</span>, 185. [<a href="https://scholar.google.com/scholar_lookup?title=Individual+Tree+Detection+and+Classification+with+UAV-Based+Photogrammetric+Point+Clouds+and+Hyperspectral+Imaging&author=Nevalainen,+O.&author=Honkavaara,+E.&author=Tuominen,+S.&author=Viljanen,+N.&author=Hakala,+T.&author=Yu,+X.&author=Hyypp%C3%A4,+J.&author=Saari,+H.&author=P%C3%B6l%C3%B6nen,+I.&author=Imai,+N.N.&publication_year=2017&journal=Remote+Sens.&volume=9&pages=185&doi=10.3390/rs9030185" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs9030185" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B102-forests-14-00945' class='html-xxx' data-content='102.'>Cohen, J. A Coefficient of Agreement for Nominal Scales. <span class='html-italic'>Educ. Psychol. Meas.</span> <b>1960</b>, <span class='html-italic'>20</span>, 37–46. 
[<a href="https://scholar.google.com/scholar_lookup?title=A+Coefficient+of+Agreement+for+Nominal+Scales&author=Cohen,+J.&publication_year=1960&journal=Educ.+Psychol.+Meas.&volume=20&pages=37%E2%80%9346&doi=10.1177/001316446002000104" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1177/001316446002000104" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B103-forests-14-00945' class='html-xxx' data-content='103.'>Millikan, P.H.K.; Silva, C.A.; Rodriguez, L.C.E.; de Oliveira, T.M.; Carvalho, M.P.d.L.C.; Carvalho, S.d.P.C. Automated Individual Tree Detection in Amazon Tropical Forest from Airborne Laser Scanning Data. <span class='html-italic'>Cerne</span> <b>2019</b>, <span class='html-italic'>25</span>, 273–282. [<a href="https://scholar.google.com/scholar_lookup?title=Automated+Individual+Tree+Detection+in+Amazon+Tropical+Forest+from+Airborne+Laser+Scanning+Data&author=Millikan,+P.H.K.&author=Silva,+C.A.&author=Rodriguez,+L.C.E.&author=de+Oliveira,+T.M.&author=Carvalho,+M.P.d.L.C.&author=Carvalho,+S.d.P.C.&publication_year=2019&journal=Cerne&volume=25&pages=273%E2%80%93282&doi=10.1590/01047760201925032630" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1590/01047760201925032630" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B104-forests-14-00945' class='html-xxx' data-content='104.'>Silva, C.A.; Hudak, A.T.; Vierling, L.A.; Loudermilk, E.L.; O’Brien, J.J.; Hiers, J.K.; Jack, S.B.; Gonzalez-Benecke, C.; Lee, H.; Falkowski, M.J. Imputation of Individual Longleaf Pine (Pinus Palustris Mill.) Tree Attributes from Field and LiDAR Data. <span class='html-italic'>Can. J. Remote Sens.</span> <b>2016</b>, <span class='html-italic'>42</span>, 554–573. [<a href="https://scholar.google.com/scholar_lookup?title=Imputation+of+Individual+Longleaf+Pine+(Pinus+Palustris+Mill.)+Tree+Attributes+from+Field+and+LiDAR+Data&author=Silva,+C.A.&author=Hudak,+A.T.&author=Vierling,+L.A.&author=Loudermilk,+E.L.&author=O%E2%80%99Brien,+J.J.&author=Hiers,+J.K.&author=Jack,+S.B.&author=Gonzalez-Benecke,+C.&author=Lee,+H.&author=Falkowski,+M.J.&publication_year=2016&journal=Can.+J.+Remote+Sens.&volume=42&pages=554%E2%80%93573&doi=10.1080/07038992.2016.1196582" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1080/07038992.2016.1196582" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B105-forests-14-00945' class='html-xxx' data-content='105.'>Wagner, F.H.; Ferreira, M.P.; Sanchez, A.; Hirye, M.C.; Zortea, M.; Gloor, E.; Phillips, O.L.; de Souza Filho, C.R.; Shimabukuro, Y.E.; Aragão, L.E. Individual Tree Crown Delineation in a Highly Diverse Tropical Forest Using Very High Resolution Satellite Images. <span class='html-italic'>ISPRS J. Photogramm. Remote Sens.</span> <b>2018</b>, <span class='html-italic'>145</span>, 362–377. 
[<a href="https://scholar.google.com/scholar_lookup?title=Individual+Tree+Crown+Delineation+in+a+Highly+Diverse+Tropical+Forest+Using+Very+High+Resolution+Satellite+Images&author=Wagner,+F.H.&author=Ferreira,+M.P.&author=Sanchez,+A.&author=Hirye,+M.C.&author=Zortea,+M.&author=Gloor,+E.&author=Phillips,+O.L.&author=de+Souza+Filho,+C.R.&author=Shimabukuro,+Y.E.&author=Arag%C3%A3o,+L.E.&publication_year=2018&journal=ISPRS+J.+Photogramm.+Remote+Sens.&volume=145&pages=362%E2%80%93377&doi=10.1016/j.isprsjprs.2018.09.013" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.isprsjprs.2018.09.013" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B106-forests-14-00945' class='html-xxx' data-content='106.'>Mäyrä, J.; Keski-Saari, S.; Kivinen, S.; Tanhuanpää, T.; Hurskainen, P.; Kullberg, P.; Poikolainen, L.; Viinikka, A.; Tuominen, S.; Kumpula, T. Tree Species Classification from Airborne Hyperspectral and LiDAR Data Using 3D Convolutional Neural Networks. <span class='html-italic'>Remote Sens. Environ.</span> <b>2021</b>, <span class='html-italic'>256</span>, 112322. [<a href="https://scholar.google.com/scholar_lookup?title=Tree+Species+Classification+from+Airborne+Hyperspectral+and+LiDAR+Data+Using+3D+Convolutional+Neural+Networks&author=M%C3%A4yr%C3%A4,+J.&author=Keski-Saari,+S.&author=Kivinen,+S.&author=Tanhuanp%C3%A4%C3%A4,+T.&author=Hurskainen,+P.&author=Kullberg,+P.&author=Poikolainen,+L.&author=Viinikka,+A.&author=Tuominen,+S.&author=Kumpula,+T.&publication_year=2021&journal=Remote+Sens.+Environ.&volume=256&pages=112322&doi=10.1016/j.rse.2021.112322" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2021.112322" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B107-forests-14-00945' class='html-xxx' data-content='107.'>Koenig, K.; Höfle, B. Full-Waveform Airborne Laser Scanning in Vegetation Studies—A Review of Point Cloud and Waveform Features for Tree Species Classification. <span class='html-italic'>Forests</span> <b>2016</b>, <span class='html-italic'>7</span>, 198. [<a href="https://scholar.google.com/scholar_lookup?title=Full-Waveform+Airborne+Laser+Scanning+in+Vegetation+Studies%E2%80%94A+Review+of+Point+Cloud+and+Waveform+Features+for+Tree+Species+Classification&author=Koenig,+K.&author=H%C3%B6fle,+B.&publication_year=2016&journal=Forests&volume=7&pages=198&doi=10.3390/f7090198" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/f7090198" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B108-forests-14-00945' class='html-xxx' data-content='108.'>Sun, P.; Yuan, X.; Li, D. Classification of Individual Tree Species Using UAV LiDAR Based on Transformer. <span class='html-italic'>Forests</span> <b>2023</b>, <span class='html-italic'>14</span>, 484. 
[<a href="https://scholar.google.com/scholar_lookup?title=Classification+of+Individual+Tree+Species+Using+UAV+LiDAR+Based+on+Transformer&author=Sun,+P.&author=Yuan,+X.&author=Li,+D.&publication_year=2023&journal=Forests&volume=14&pages=484&doi=10.3390/f14030484" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/f14030484" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B109-forests-14-00945' class='html-xxx' data-content='109.'>Jombo, S.; Adam, E.; Tesfamichael, S. Classification of Urban Tree Species Using LiDAR Data and WorldView-2 Satellite Imagery in a Heterogeneous Environment. <span class='html-italic'>Geocarto Int.</span> <b>2022</b>, <span class='html-italic'>37</span>, 1–24. [<a href="https://scholar.google.com/scholar_lookup?title=Classification+of+Urban+Tree+Species+Using+LiDAR+Data+and+WorldView-2+Satellite+Imagery+in+a+Heterogeneous+Environment&author=Jombo,+S.&author=Adam,+E.&author=Tesfamichael,+S.&publication_year=2022&journal=Geocarto+Int.&volume=37&pages=1%E2%80%9324&doi=10.1080/10106049.2022.2028904" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1080/10106049.2022.2028904" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B110-forests-14-00945' class='html-xxx' data-content='110.'>Qin, H.; Zhou, W.; Yao, Y.; Wang, W. Individual Tree Segmentation and Tree Species Classification in Subtropical Broadleaf Forests Using UAV-Based LiDAR, Hyperspectral, and Ultrahigh-Resolution RGB Data. <span class='html-italic'>Remote Sens. Environ.</span> <b>2022</b>, <span class='html-italic'>280</span>, 113143. [<a href="https://scholar.google.com/scholar_lookup?title=Individual+Tree+Segmentation+and+Tree+Species+Classification+in+Subtropical+Broadleaf+Forests+Using+UAV-Based+LiDAR,+Hyperspectral,+and+Ultrahigh-Resolution+RGB+Data&author=Qin,+H.&author=Zhou,+W.&author=Yao,+Y.&author=Wang,+W.&publication_year=2022&journal=Remote+Sens.+Environ.&volume=280&pages=113143&doi=10.1016/j.rse.2022.113143" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2022.113143" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B111-forests-14-00945' class='html-xxx' data-content='111.'>Wan, H.; Tang, Y.; Jing, L.; Li, H.; Qiu, F.; Wu, W. Tree Species Classification of Forest Stands Using Multisource Remote Sensing Data. <span class='html-italic'>Remote Sens.</span> <b>2021</b>, <span class='html-italic'>13</span>, 144. [<a href="https://scholar.google.com/scholar_lookup?title=Tree+Species+Classification+of+Forest+Stands+Using+Multisource+Remote+Sensing+Data&author=Wan,+H.&author=Tang,+Y.&author=Jing,+L.&author=Li,+H.&author=Qiu,+F.&author=Wu,+W.&publication_year=2021&journal=Remote+Sens.&volume=13&pages=144&doi=10.3390/rs13010144" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs13010144" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B112-forests-14-00945' class='html-xxx' data-content='112.'>Wu, Y.; Zhang, X. Object-Based Tree Species Classification Using Airborne Hyperspectral Images and LiDAR Data. <span class='html-italic'>Forests</span> <b>2019</b>, <span class='html-italic'>11</span>, 32. 
[<a href="https://scholar.google.com/scholar_lookup?title=Object-Based+Tree+Species+Classification+Using+Airborne+Hyperspectral+Images+and+LiDAR+Data&author=Wu,+Y.&author=Zhang,+X.&publication_year=2019&journal=Forests&volume=11&pages=32&doi=10.3390/f11010032" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/f11010032" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B113-forests-14-00945' class='html-xxx' data-content='113.'>You, H.T.; Lei, P.; Li, M.S.; Ruan, F.Q. Forest Species Classification Based on Three-Dimensional Coordinate and Intensity Information of Airborne LiDAR Data with Random Forest Method. <span class='html-italic'>Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.</span> <b>2020</b>, <span class='html-italic'>42</span>, 117–123. [<a href="https://scholar.google.com/scholar_lookup?title=Forest+Species+Classification+Based+on+Three-Dimensional+Coordinate+and+Intensity+Information+of+Airborne+LiDAR+Data+with+Random+Forest+Method&author=You,+H.T.&author=Lei,+P.&author=Li,+M.S.&author=Ruan,+F.Q.&publication_year=2020&journal=Int.+Arch.+Photogramm.+Remote+Sens.+Spat.+Inf.+Sci.&volume=42&pages=117%E2%80%93123&doi=10.5194/isprs-archives-XLII-3-W10-117-2020" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.5194/isprs-archives-XLII-3-W10-117-2020" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B114-forests-14-00945' class='html-xxx' data-content='114.'>Ballanti, L.; Blesius, L.; Hines, E.; Kruse, B. Tree Species Classification Using Hyperspectral Imagery: A Comparison of Two Classifiers. <span class='html-italic'>Remote Sens.</span> <b>2016</b>, <span class='html-italic'>8</span>, 445. [<a href="https://scholar.google.com/scholar_lookup?title=Tree+Species+Classification+Using+Hyperspectral+Imagery:+A+Comparison+of+Two+Classifiers&author=Ballanti,+L.&author=Blesius,+L.&author=Hines,+E.&author=Kruse,+B.&publication_year=2016&journal=Remote+Sens.&volume=8&pages=445&doi=10.3390/rs8060445" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs8060445" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B115-forests-14-00945' class='html-xxx' data-content='115.'>Reitberger, J.; Krzystek, P.; Stilla, U. Analysis of Full Waveform Lidar Data for Tree Species Classification. <span class='html-italic'>Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.</span> <b>2006</b>, <span class='html-italic'>36</span>, 228–233. [<a href="https://scholar.google.com/scholar_lookup?title=Analysis+of+Full+Waveform+Lidar+Data+for+Tree+Species+Classification&author=Reitberger,+J.&author=Krzystek,+P.&author=Stilla,+U.&publication_year=2006&journal=Int.+Arch.+Photogramm.+Remote+Sens.+Spat.+Inf.+Sci.&volume=36&pages=228%E2%80%93233" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B116-forests-14-00945' class='html-xxx' data-content='116.'>Xu, G.; Pang, Y.; Li, Z.; Zhao, D.; Liu, L. Individual Trees Species Classification Using Relative Calibrated Fullwaveform LiDAR Data. In Proceedings of the 2012 Silvilaser International Conference on Lidar Applications for Assessing Forest Ecosystems, Vancouver, BC, Canada, 16–19 September 2012; Volume 1619, p. 165176. 
[<a href="https://scholar.google.com/scholar_lookup?title=Individual+Trees+Species+Classification+Using+Relative+Calibrated+Fullwaveform+LiDAR+Data&conference=Proceedings+of+the+2012+Silvilaser+International+Conference+on+Lidar+Applications+for+Assessing+Forest+Ecosystems&author=Xu,+G.&author=Pang,+Y.&author=Li,+Z.&author=Zhao,+D.&author=Liu,+L.&publication_year=2012&pages=165176" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B117-forests-14-00945' class='html-xxx' data-content='117.'>Cao, L.; Gao, S.; Li, P.; Yun, T.; Shen, X.; Ruan, H. Aboveground Biomass Estimation of Individual Trees in a Coastal Planted Forest Using Full-Waveform Airborne Laser Scanning Data. <span class='html-italic'>Remote Sens.</span> <b>2016</b>, <span class='html-italic'>8</span>, 729. [<a href="https://scholar.google.com/scholar_lookup?title=Aboveground+Biomass+Estimation+of+Individual+Trees+in+a+Coastal+Planted+Forest+Using+Full-Waveform+Airborne+Laser+Scanning+Data&author=Cao,+L.&author=Gao,+S.&author=Li,+P.&author=Yun,+T.&author=Shen,+X.&author=Ruan,+H.&publication_year=2016&journal=Remote+Sens.&volume=8&pages=729&doi=10.3390/rs8090729" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs8090729" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B118-forests-14-00945' class='html-xxx' data-content='118.'>Hollaus, M.; Mücke, W.; Höfle, B.; Dorigo, W.; Pfeifer, N.; Wagner, W.; Bauerhansl, C.; Regner, B. Tree Species Classification Based on Full-Waveform Airborne Laser Scanning Data. In Proceedings of the SILVILASER, College Station, TX, USA, 14–16 October 2009. [<a href="https://scholar.google.com/scholar_lookup?title=Tree+Species+Classification+Based+on+Full-Waveform+Airborne+Laser+Scanning+Data&conference=Proceedings+of+the+SILVILASER&author=Hollaus,+M.&author=M%C3%BCcke,+W.&author=H%C3%B6fle,+B.&author=Dorigo,+W.&author=Pfeifer,+N.&author=Wagner,+W.&author=Bauerhansl,+C.&author=Regner,+B.&publication_year=2009" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B119-forests-14-00945' class='html-xxx' data-content='119.'>Lines, E.R.; Allen, M.; Cabo, C.; Calders, K.; Debus, A.; Grieve, S.W.; Miltiadou, M.; Noach, A.; Owen, H.J.; Puliti, S. AI Applications in Forest Monitoring Need Remote Sensing Benchmark Datasets. <span class='html-italic'>arXiv</span> <b>2022</b>, arXiv:2212.09937. [<a href="https://scholar.google.com/scholar_lookup?title=AI+Applications+in+Forest+Monitoring+Need+Remote+Sensing+Benchmark+Datasets&author=Lines,+E.R.&author=Allen,+M.&author=Cabo,+C.&author=Calders,+K.&author=Debus,+A.&author=Grieve,+S.W.&author=Miltiadou,+M.&author=Noach,+A.&author=Owen,+H.J.&author=Puliti,+S.&publication_year=2022&journal=arXiv&doi=10.48550/arXiv.2212.09937" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.48550/arXiv.2212.09937" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B120-forests-14-00945' class='html-xxx' data-content='120.'>Anderson, K.; Hancock, S.; Disney, M.; Gaston, K.J. Is Waveform Worth It? A Comparison of Li DAR Approaches for Vegetation and Landscape Characterization. <span class='html-italic'>Remote Sens. Ecol. Conserv.</span> <b>2016</b>, <span class='html-italic'>2</span>, 5–15. 
[<a href="https://scholar.google.com/scholar_lookup?title=Is+Waveform+Worth+It?+A+Comparison+of+Li+DAR+Approaches+for+Vegetation+and+Landscape+Characterization&author=Anderson,+K.&author=Hancock,+S.&author=Disney,+M.&author=Gaston,+K.J.&publication_year=2016&journal=Remote+Sens.+Ecol.+Conserv.&volume=2&pages=5%E2%80%9315&doi=10.1002/rse2.8" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1002/rse2.8" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B121-forests-14-00945' class='html-xxx' data-content='121.'>Asner, G.P.; Knapp, D.E.; Kennedy-Bowdoin, T.; Jones, M.O.; Martin, R.E.; Boardman, J.; Hughes, R.F. Invasive Species Detection in Hawaiian Rainforests Using Airborne Imaging Spectroscopy and LiDAR. <span class='html-italic'>Remote Sens. Environ.</span> <b>2008</b>, <span class='html-italic'>112</span>, 1942–1955. [<a href="https://scholar.google.com/scholar_lookup?title=Invasive+Species+Detection+in+Hawaiian+Rainforests+Using+Airborne+Imaging+Spectroscopy+and+LiDAR&author=Asner,+G.P.&author=Knapp,+D.E.&author=Kennedy-Bowdoin,+T.&author=Jones,+M.O.&author=Martin,+R.E.&author=Boardman,+J.&author=Hughes,+R.F.&publication_year=2008&journal=Remote+Sens.+Environ.&volume=112&pages=1942%E2%80%931955&doi=10.1016/j.rse.2007.11.016" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.1016/j.rse.2007.11.016" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B122-forests-14-00945' class='html-xxx' data-content='122.'>Shen, X.; Cao, L. Tree-Species Classification in Subtropical Forests Using Airborne Hyperspectral and LiDAR Data. <span class='html-italic'>Remote Sens.</span> <b>2017</b>, <span class='html-italic'>9</span>, 1180. [<a href="https://scholar.google.com/scholar_lookup?title=Tree-Species+Classification+in+Subtropical+Forests+Using+Airborne+Hyperspectral+and+LiDAR+Data&author=Shen,+X.&author=Cao,+L.&publication_year=2017&journal=Remote+Sens.&volume=9&pages=1180&doi=10.3390/rs9111180" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>] [<a href="https://doi.org/10.3390/rs9111180" class='cross-ref' target='_blank' rel='noopener noreferrer'>CrossRef</a>]</li><li id='B123-forests-14-00945' class='html-xxx' data-content='123.'>Rogers, J.; Gunn, S. Identifying Feature Relevance Using a Random Forest. In <span class='html-italic'>Proceedings of the Subspace, Latent Structure and Feature Selection: Statistical and Optimization Perspectives Workshop, SLSFS 2005, Bohinj, Slovenia, 23–25 February 2005, Revised Selected Papers</span>; Springer: Berlin/Heidelberg, Germany, 2006; pp. 173–184. [<a href="https://scholar.google.com/scholar_lookup?title=Identifying+Feature+Relevance+Using+a+Random+Forest&author=Rogers,+J.&author=Gunn,+S.&publication_year=2006&pages=173%E2%80%93184" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B124-forests-14-00945' class='html-xxx' data-content='124.'>Zhang, Y.; Song, B.; Zhang, Y.; Chen, S. An Advanced Random Forest Algorithm Targeting the Big Data with Redundant Features. In <span class='html-italic'>Proceedings of the Algorithms and Architectures for Parallel Processing: 17th International Conference, ICA3PP 2017, Helsinki, Finland, 21–23 August 2017, Proceedings 17</span>; Springer: Berlin/Heidelberg, Germany, 2017; pp. 642–651. 
[<a href="https://scholar.google.com/scholar_lookup?title=An+Advanced+Random+Forest+Algorithm+Targeting+the+Big+Data+with+Redundant+Features&author=Zhang,+Y.&author=Song,+B.&author=Zhang,+Y.&author=Chen,+S.&publication_year=2017&pages=642%E2%80%93651" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li><li id='B125-forests-14-00945' class='html-xxx' data-content='125.'>Van Coillie, F.M.; Liao, W.; Kempeneers, P.; Vandekerkhove, K.; Gautama, S.; Philips, W.; De Wulf, R.R. Optimized Feature Fusion of LiDAR and Hyperspectral Data for Tree Species Mapping in Closed Forest Canopies. In <span class='html-italic'>Proceedings of the 2015 7th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS)</span>; IEEE: Piscataway, NJ, USA, 2015; pp. 1–4. [<a href="https://scholar.google.com/scholar_lookup?title=Optimized+Feature+Fusion+of+LiDAR+and+Hyperspectral+Data+for+Tree+Species+Mapping+in+Closed+Forest+Canopies&author=Van+Coillie,+F.M.&author=Liao,+W.&author=Kempeneers,+P.&author=Vandekerkhove,+K.&author=Gautama,+S.&author=Philips,+W.&author=De+Wulf,+R.R.&publication_year=2015&pages=1%E2%80%934" class='google-scholar' target='_blank' rel='noopener noreferrer'>Google Scholar</a>]</li></ol></section><section id='FiguresandTables' type='display-objects'><div class="html-fig-wrap" id="forests-14-00945-f001"> <div class='html-fig_img'> <div class="html-figpopup html-figpopup-link" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f001"> <img alt="Forests 14 00945 g001 550" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g001.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g001.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g001-550.jpg" /> <a class="html-expand html-figpopup" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f001"></a> </div> </div> <div class="html-fig_description"> <b>Figure 1.</b> Location of the <span class='html-italic'>Ponte Branca</span> Forest remnant with the field plots and the different successional stages found (Source: Martins-Neto et al., 2022 [<a href="#B33-forests-14-00945" class="html-bibr">33</a>]). 
<!-- <p><a class="html-figpopup" href="#fig_body_display_forests-14-00945-f001"> Click here to enlarge figure </a></p> --> </div> </div> <div class="html-fig_show mfp-hide" id ="fig_body_display_forests-14-00945-f001" > <div class="html-caption" > <b>Figure 1.</b> Location of the <span class='html-italic'>Ponte Branca</span> Forest remnant with the field plots and the different successional stages found (Source: Martins-Neto et al., 2022 [<a href="#B33-forests-14-00945" class="html-bibr">33</a>]).</div> <div class="html-img"><img alt="Forests 14 00945 g001" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g001.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g001.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g001.png" /></div> </div><div class="html-fig-wrap" id="forests-14-00945-f002"> <div class='html-fig_img'> <div class="html-figpopup html-figpopup-link" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f002"> <img alt="Forests 14 00945 g002 550" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g002.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g002.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g002-550.jpg" /> <a class="html-expand html-figpopup" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f002"></a> </div> </div> <div class="html-fig_description"> <b>Figure 2.</b> Vertical stratification of <span class='html-italic'>Ponte Branca</span> Forest remnant. (<b>a</b>) All tree heights. (<b>b</b>) Lower stratum. (<b>c</b>) Middle stratum. (<b>d</b>) Upper stratum. <!-- <p><a class="html-figpopup" href="#fig_body_display_forests-14-00945-f002"> Click here to enlarge figure </a></p> --> </div> </div> <div class="html-fig_show mfp-hide" id ="fig_body_display_forests-14-00945-f002" > <div class="html-caption" > <b>Figure 2.</b> Vertical stratification of <span class='html-italic'>Ponte Branca</span> Forest remnant. (<b>a</b>) All tree heights. (<b>b</b>) Lower stratum. (<b>c</b>) Middle stratum. 
Figure 3. Individual tree crowns delineated manually for each species identified in the field, shown in green on the hyperspectral orthomosaics (R: 780.49 nm; G: 650.96 nm; B: 535.09 nm) and in red on the RGB images.

Figure 4. (a) Rikola hyperspectral camera. (b) UAV quadcopter with the Rikola camera mounted (Source: Miyoshi, 2020 [47]).
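The false-colour composite named in the Figure 3 caption (R: 780.49 nm, G: 650.96 nm, B: 535.09 nm) amounts to picking the Rikola bands closest to those wavelengths (Table 2) and stacking them. A short sketch, using a random array as a stand-in for the real hyperspectral cube:

```python
import numpy as np

# Build the R/G/B composite of Figure 3 by selecting the Rikola bands nearest
# to the target wavelengths listed in Table 2. The cube below is a random
# placeholder with shape (rows, cols, 25 bands).
wavelengths_nm = np.array([
    506.22, 519.94, 535.09, 550.39, 565.10, 580.16, 591.90, 609.00, 620.22, 628.75,
    650.96, 659.72, 669.75, 679.84, 690.28, 700.28, 710.06, 720.17, 729.57, 740.42,
    750.16, 769.89, 780.49, 790.30, 819.66,
])
cube = np.random.default_rng(5).uniform(0, 0.6, size=(256, 256, 25))

def nearest_band(target_nm: float) -> int:
    """Index of the band whose centre wavelength is closest to target_nm."""
    return int(np.argmin(np.abs(wavelengths_nm - target_nm)))

composite = np.dstack([cube[:, :, nearest_band(nm)] for nm in (780.49, 650.96, 535.09)])
print("R/G/B band indices:", [nearest_band(nm) for nm in (780.49, 650.96, 535.09)])
print("composite shape:", composite.shape)
```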
Figure 5. Targets located near the overflown area. The radiometric targets are in red, and the GCPs are in blue.

Figure 6. Hyperspectral image processing flowchart (Source: adapted from Näsi et al., 2015 and Moriya et al., 2017 [51,53]).
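The processing chain summarised in Figure 6 is not reproduced here, but one step such chains typically include, converting digital numbers to reflectance with radiometric targets like those in Figure 5, can be sketched as a per-band empirical-line fit. The target reflectances and DN values below are invented placeholders; this illustrates the idea only and is not the authors' calibration.

```python
import numpy as np

# Minimal empirical-line sketch: fit a linear model between the DNs observed
# over the radiometric targets and their known reflectances, then apply it to
# a whole band. All numbers are invented placeholders.
target_reflectance = np.array([0.05, 0.20, 0.50])   # nominal target reflectances
target_dn = np.array([310.0, 1250.0, 3100.0])       # mean DNs extracted over the targets

gain, offset = np.polyfit(target_dn, target_reflectance, deg=1)

def dn_to_reflectance(band_dn: np.ndarray) -> np.ndarray:
    """Convert a band of digital numbers to reflectance, clipped to [0, 1]."""
    return np.clip(gain * band_dn + offset, 0.0, 1.0)

band = np.random.default_rng(0).uniform(200, 3500, size=(4, 4))  # fake DN tile
print(dn_to_reflectance(band).round(3))
```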
<!-- <p><a class="html-figpopup" href="#fig_body_display_forests-14-00945-f006"> Click here to enlarge figure </a></p> --> </div> </div> <div class="html-fig_show mfp-hide" id ="fig_body_display_forests-14-00945-f006" > <div class="html-caption" > <b>Figure 6.</b> Hyperspectral images processing flowchart (Source: adapted from Näsi et al., 2015 and Moriya et al., 2017 [<a href="#B51-forests-14-00945" class="html-bibr">51</a>,<a href="#B53-forests-14-00945" class="html-bibr">53</a>]).</div> <div class="html-img"><img alt="Forests 14 00945 g006" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g006.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g006.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g006.png" /></div> </div><div class="html-fig-wrap" id="forests-14-00945-f007"> <div class='html-fig_img'> <div class="html-figpopup html-figpopup-link" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f007"> <img alt="Forests 14 00945 g007 550" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g007.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g007.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g007-550.jpg" /> <a class="html-expand html-figpopup" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f007"></a> </div> </div> <div class="html-fig_description"> <b>Figure 7.</b> LiDAR data processing flowchart. Dark blue are steps for PR data and light blue for FWF data. <!-- <p><a class="html-figpopup" href="#fig_body_display_forests-14-00945-f007"> Click here to enlarge figure </a></p> --> </div> </div> <div class="html-fig_show mfp-hide" id ="fig_body_display_forests-14-00945-f007" > <div class="html-caption" > <b>Figure 7.</b> LiDAR data processing flowchart. Dark blue are steps for PR data and light blue for FWF data.</div> <div class="html-img"><img alt="Forests 14 00945 g007" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g007.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g007.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g007.png" /></div> </div><div class="html-fig-wrap" id="forests-14-00945-f008"> <div class='html-fig_img'> <div class="html-figpopup html-figpopup-link" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f008"> <img alt="Forests 14 00945 g008 550" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g008.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g008.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g008-550.jpg" /> <a class="html-expand html-figpopup" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f008"></a> </div> </div> <div class="html-fig_description"> <b>Figure 8.</b> Canopy height model with the superpixels; on the left, it was generated with 100,000 superpixels, and on the right, it was generated with 200,000 superpixels. 
<!-- <p><a class="html-figpopup" href="#fig_body_display_forests-14-00945-f008"> Click here to enlarge figure </a></p> --> </div> </div> <div class="html-fig_show mfp-hide" id ="fig_body_display_forests-14-00945-f008" > <div class="html-caption" > <b>Figure 8.</b> Canopy height model with the superpixels; on the left, it was generated with 100,000 superpixels, and on the right, it was generated with 200,000 superpixels.</div> <div class="html-img"><img alt="Forests 14 00945 g008" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g008.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g008.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g008.png" /></div> </div><div class="html-fig-wrap" id="forests-14-00945-f009"> <div class='html-fig_img'> <div class="html-figpopup html-figpopup-link" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f009"> <img alt="Forests 14 00945 g009 550" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g009.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g009.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g009-550.jpg" /> <a class="html-expand html-figpopup" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f009"></a> </div> </div> <div class="html-fig_description"> <b>Figure 9.</b> The selected superpixels are depicted in blue and were derived based on the criteria present in <a href="#forests-14-00945-t004" class="html-table">Table 4</a>. The merged superpixels are depicted with yellow9 while the white colour shows the comparison of the superpixels with the reference ITC. Regarding the case of SyRo, manual corrections was performed on the merged superpixels since an excessive number of cells were selected. <!-- <p><a class="html-figpopup" href="#fig_body_display_forests-14-00945-f009"> Click here to enlarge figure </a></p> --> </div> </div> <div class="html-fig_show mfp-hide" id ="fig_body_display_forests-14-00945-f009" > <div class="html-caption" > <b>Figure 9.</b> The selected superpixels are depicted in blue and were derived based on the criteria present in <a href="#forests-14-00945-t004" class="html-table">Table 4</a>. The merged superpixels are depicted with yellow9 while the white colour shows the comparison of the superpixels with the reference ITC. 
Figure 10. Mean spectra for each tree species.

Figure 11. Accuracy assessment of the 13 scenarios tested for the classification of tree species.
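The per-species mean spectra of Figure 10 can be obtained by averaging the crown pixels of each species band by band. A minimal sketch, assuming a table with one row per pixel (a species code plus one reflectance column per Rikola band); the values and species codes below are random placeholders.

```python
import numpy as np
import pandas as pd

# Compute one mean spectrum per species (cf. Figure 10) from per-pixel samples.
bands = [f"b{i:02d}" for i in range(1, 26)]                # 25 hyperspectral bands
rng = np.random.default_rng(1)
pixels = pd.DataFrame(rng.uniform(0, 0.6, size=(300, 25)), columns=bands)
pixels["species"] = rng.choice(["AnPe", "ApLe", "SyRo"], size=300)

mean_spectra = pixels.groupby("species")[bands].mean()     # one row per species
print(mean_spectra.round(3))
```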
<!-- <p><a class="html-figpopup" href="#fig_body_display_forests-14-00945-f011"> Click here to enlarge figure </a></p> --> </div> </div> <div class="html-fig_show mfp-hide" id ="fig_body_display_forests-14-00945-f011" > <div class="html-caption" > <b>Figure 11.</b> Accuracy assessment of the 13 scenarios tested for the classification of tree species.</div> <div class="html-img"><img alt="Forests 14 00945 g011" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g011.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g011.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g011.png" /></div> </div><div class="html-fig-wrap" id="forests-14-00945-f012"> <div class='html-fig_img'> <div class="html-figpopup html-figpopup-link" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f012"> <img alt="Forests 14 00945 g012 550" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g012.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g012.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g012-550.jpg" /> <a class="html-expand html-figpopup" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f012"></a> </div> </div> <div class="html-fig_description"> <b>Figure 12.</b> Confusion matrix for the classification of the eight tree species for the two best scenarios. <!-- <p><a class="html-figpopup" href="#fig_body_display_forests-14-00945-f012"> Click here to enlarge figure </a></p> --> </div> </div> <div class="html-fig_show mfp-hide" id ="fig_body_display_forests-14-00945-f012" > <div class="html-caption" > <b>Figure 12.</b> Confusion matrix for the classification of the eight tree species for the two best scenarios.</div> <div class="html-img"><img alt="Forests 14 00945 g012" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g012.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g012.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g012.png" /></div> </div><div class="html-fig-wrap" id="forests-14-00945-f013"> <div class='html-fig_img'> <div class="html-figpopup html-figpopup-link" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f013"> <img alt="Forests 14 00945 g013 550" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g013.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g013.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g013-550.jpg" /> <a class="html-expand html-figpopup" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f013"></a> </div> </div> <div class="html-fig_description"> <b>Figure 13.</b> Feature importance of tree species classification for S11 and S13. 
<!-- <p><a class="html-figpopup" href="#fig_body_display_forests-14-00945-f013"> Click here to enlarge figure </a></p> --> </div> </div> <div class="html-fig_show mfp-hide" id ="fig_body_display_forests-14-00945-f013" > <div class="html-caption" > <b>Figure 13.</b> Feature importance of tree species classification for S11 and S13.</div> <div class="html-img"><img alt="Forests 14 00945 g013" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g013.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g013.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g013.png" /></div> </div><div class="html-fig-wrap" id="forests-14-00945-f014"> <div class='html-fig_img'> <div class="html-figpopup html-figpopup-link" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f014"> <img alt="Forests 14 00945 g014 550" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g014.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g014.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g014-550.jpg" /> <a class="html-expand html-figpopup" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#fig_body_display_forests-14-00945-f014"></a> </div> </div> <div class="html-fig_description"> <b>Figure 14.</b> Projection of features for the first and fourth principal components and their respective contribution. <!-- <p><a class="html-figpopup" href="#fig_body_display_forests-14-00945-f014"> Click here to enlarge figure </a></p> --> </div> </div> <div class="html-fig_show mfp-hide" id ="fig_body_display_forests-14-00945-f014" > <div class="html-caption" > <b>Figure 14.</b> Projection of features for the first and fourth principal components and their respective contribution.</div> <div class="html-img"><img alt="Forests 14 00945 g014" data-large="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g014.png" data-original="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g014.png" data-lsrc="/forests/forests-14-00945/article_deploy/html/images/forests-14-00945-g014.png" /></div> </div><div class="html-table-wrap" id="forests-14-00945-t001"> <div class="html-table_wrap_td" > <div class="html-tablepopup html-tablepopup-link" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href='#table_body_display_forests-14-00945-t001'> <img alt="Table" data-lsrc="https://www.mdpi.com/img/table.png" /> <a class="html-expand html-tablepopup" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#table_body_display_forests-14-00945-t001"></a> </div> </div> <div class="html-table_wrap_discription"> <b>Table 1.</b> Summary of selected species for automatic classification. 
</div> </div> <div class="html-table_show mfp-hide " id ="table_body_display_forests-14-00945-t001" > <div class="html-caption" ><b>Table 1.</b> Summary of selected species for automatic classification.</div> <table > <thead ><tr ><th align='center' valign='middle' style='border-top:solid thin;border-bottom:solid thin' class='html-align-center' >ID</th><th align='center' valign='middle' style='border-top:solid thin;border-bottom:solid thin' class='html-align-center' >Tree Species</th><th align='center' valign='middle' style='border-top:solid thin;border-bottom:solid thin' class='html-align-center' >Family</th><th align='center' valign='middle' style='border-top:solid thin;border-bottom:solid thin' class='html-align-center' >ITC</th><th align='center' valign='middle' style='border-top:solid thin;border-bottom:solid thin' class='html-align-center' >T/A <sup>1</sup></th><th align='center' valign='middle' style='border-top:solid thin;border-bottom:solid thin' class='html-align-center' >Hm(m) <sup>2</sup></th><th align='center' valign='middle' style='border-top:solid thin;border-bottom:solid thin' class='html-align-center' >Characteristics</th></tr></thead><tbody ><tr ><td align='center' valign='middle' class='html-align-center' >AnPe</td><td align='center' valign='middle' class='html-align-center' ><span class='html-italic'>Anadenanthera peregrina</span></td><td align='center' valign='middle' class='html-align-center' >Fabaceae–Mimosoidae</td><td align='center' valign='middle' class='html-align-center' >9</td><td align='center' valign='middle' class='html-align-center' >7641/849</td><td align='center' valign='middle' class='html-align-center' >17.64 ± 2.64</td><td align='center' valign='middle' class='html-align-center' >Evergreen species, with characteristics from pioneer to early secondary. It is fast-growing and its uses include urban afforestation, recovery of degraded areas, and wood for civil construction [<a href="#B40-forests-14-00945" class="html-bibr">40</a>].</td></tr><tr ><td align='center' valign='middle' class='html-align-center' >ApLe</td><td align='center' valign='middle' class='html-align-center' ><span class='html-italic'>Apuleia leiocarpa</span></td><td align='center' valign='middle' class='html-align-center' >Fabaceae–Caesalpionideae</td><td align='center' valign='middle' class='html-align-center' >9</td><td align='center' valign='middle' class='html-align-center' >4960/551</td><td align='center' valign='middle' class='html-align-center' >14.27 ± 3.46</td><td align='center' valign='middle' class='html-align-center' >Deciduous and slow-growing species, with characteristics from pioneer to early secondary. Its wood is resistant, suitable for construction of external structures. Furthermore, it can be used in urban afforestation, honey production, and riparian forest restoration in areas without flooding [<a href="#B40-forests-14-00945" class="html-bibr">40</a>].</td></tr><tr ><td align='center' valign='middle' class='html-align-center' >AsPo</td><td align='center' valign='middle' class='html-align-center' ><span class='html-italic'>Aspidosperma polyneuron</span></td><td align='center' valign='middle' class='html-align-center' >Apocynaceae</td><td align='center' valign='middle' class='html-align-center' >9</td><td align='center' valign='middle' class='html-align-center' >28,946/3216</td><td align='center' valign='middle' class='html-align-center' >22.13 ± 3.62</td><td align='center' valign='middle' class='html-align-center' >Evergreen species, late secondary to climax. 
Long-lived species with very slow growth. Wood with a high economic value has good mechanical resistance used in the furniture industry, construction, carpentry, and shipbuilding [<a href="#B40-forests-14-00945" class="html-bibr">40</a>].</td></tr><tr ><td align='center' valign='middle' class='html-align-center' >CoLa</td><td align='center' valign='middle' class='html-align-center' ><span class='html-italic'>Copaifera</span><span class='html-italic'>langsdorffii</span></td><td align='center' valign='middle' class='html-align-center' >Fabaceae–Caesalpionideae</td><td align='center' valign='middle' class='html-align-center' >9</td><td align='center' valign='middle' class='html-align-center' >9984/1109</td><td align='center' valign='middle' class='html-align-center' >15.14 ± 2.58</td><td align='center' valign='middle' class='html-align-center' >Semi-deciduous tree, with late secondary to climax characteristics. Species with remarkable plasticity and easy adaptation. Long-lived tree with moderate to slow growth. High durability wood used in civil construction. However, the most significant feature of this species is the extraction of its essential oil, used in the cosmetics, plastics, paints, and resins industry [<a href="#B40-forests-14-00945" class="html-bibr">40</a>].</td></tr><tr ><td align='center' valign='middle' class='html-align-center' >HeAp</td><td align='center' valign='middle' class='html-align-center' ><span class='html-italic'>Helietta apiculata</span></td><td align='center' valign='middle' class='html-align-center' >Rutaceae</td><td align='center' valign='middle' class='html-align-center' >10</td><td align='center' valign='middle' class='html-align-center' >3549/355</td><td align='center' valign='middle' class='html-align-center' >13.41 ± 0.78</td><td align='center' valign='middle' class='html-align-center' >Evergreen tree, with early and late secondary characteristics. This species is slow growing, with dense wood, and is very useful for manufacturing pieces that require great durability. In addition, this species has a good development in shallow and rocky soils, indicated for the recovery of degraded areas [<a href="#B41-forests-14-00945" class="html-bibr">41</a>].</td></tr><tr ><td align='center' valign='middle' class='html-align-center' >HyCo</td><td align='center' valign='middle' class='html-align-center' ><span class='html-italic'>Hymenaea</span><span class='html-italic'>courbaril</span></td><td align='center' valign='middle' class='html-align-center' >Fabaceae–Caesalpionideae</td><td align='center' valign='middle' class='html-align-center' >8</td><td align='center' valign='middle' class='html-align-center' >9308/1164</td><td align='center' valign='middle' class='html-align-center' >15.49 ± 3.27</td><td align='center' valign='middle' class='html-align-center' >Long-lived semi-deciduous tree with late secondary to climax characteristics. This species presents moderate to slow growth with high-density wood. The uses are for civil and external construction and carpentry. The resin from this tree is used to manufacture varnishes and medicinal uses. 
In addition, this species can be used for the production of honey [<a href="#B40-forests-14-00945" class="html-bibr">40</a>].</td></tr><tr ><td align='center' valign='middle' class='html-align-center' >PtPu</td><td align='center' valign='middle' class='html-align-center' ><span class='html-italic'>Pterodon pubescens</span></td><td align='center' valign='middle' class='html-align-center' >Fabaceae–Faboideae</td><td align='center' valign='middle' class='html-align-center' >6</td><td align='center' valign='middle' class='html-align-center' >12,249/2042</td><td align='center' valign='middle' class='html-align-center' >15.35 ± 2.89</td><td align='center' valign='middle' class='html-align-center' >Deciduous species, with characteristic of initial secondary. It is fast-growing, and the wood presents high density being used for civil construction. Other uses of this species include honey production, urban afforestation, and recovery of degraded areas [<a href="#B42-forests-14-00945" class="html-bibr">42</a>].</td></tr><tr ><td align='center' valign='middle' style='border-bottom:solid thin' class='html-align-center' >SyRo</td><td align='center' valign='middle' style='border-bottom:solid thin' class='html-align-center' ><span class='html-italic'>Syagrus</span><span class='html-italic'>romanzoffiana</span></td><td align='center' valign='middle' style='border-bottom:solid thin' class='html-align-center' >Arecaceae</td><td align='center' valign='middle' style='border-bottom:solid thin' class='html-align-center' >21</td><td align='center' valign='middle' style='border-bottom:solid thin' class='html-align-center' >5731/273</td><td align='center' valign='middle' style='border-bottom:solid thin' class='html-align-center' >13.00± 0.55</td><td align='center' valign='middle' style='border-bottom:solid thin' class='html-align-center' >Palm tree, with a characteristic of pioneer species, early secondary and late secondary. This species has great plasticity, occurring in soils of low and high chemical fertility, drained to flooded. Its growth is slow, and its fruits serve as food for countless animals [<a href="#B43-forests-14-00945" class="html-bibr">43</a>].</td></tr></tbody> </table> <div class='html-table_foot html-p'><div class='html-p' style='text-indent:0em;'><span class='html-fn-content'><sup>1</sup> Total and average number of pixels for each tree species; <sup>2</sup> Average height of the trees obtained from the CHM followed by the standard deviation.</span></div><div style='clear:both;'></div></div> </div><div class="html-table-wrap" id="forests-14-00945-t002"> <div class="html-table_wrap_td" > <div class="html-tablepopup html-tablepopup-link" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href='#table_body_display_forests-14-00945-t002'> <img alt="Table" data-lsrc="https://www.mdpi.com/img/table.png" /> <a class="html-expand html-tablepopup" data-counterslinkmanual = "https://www.mdpi.com/1999-4907/14/5/945/display" href="#table_body_display_forests-14-00945-t002"></a> </div> </div> <div class="html-table_wrap_discription"> <b>Table 2.</b> Wavelengths used in the Rikola camera bands and their respective FWHM. 
Table 2. Wavelengths used in the Rikola camera bands and their respective FWHM.

| Band (Sensor 2) | λ * (nm) | FWHM (nm) | Band (Sensor 1) | λ * (nm) | FWHM (nm) |
|---|---|---|---|---|---|
| 1 | 506.22 | 12.44 | 11 | 650.96 | 14.44 |
| 2 | 519.94 | 17.38 | 12 | 659.72 | 16.83 |
| 3 | 535.09 | 16.84 | 13 | 669.75 | 19.80 |
| 4 | 550.39 | 16.53 | 14 | 679.84 | 20.45 |
| 5 | 565.10 | 17.26 | 15 | 690.28 | 18.87 |
| 6 | 580.16 | 15.95 | 16 | 700.28 | 18.94 |
| 7 | 591.90 | 16.61 | 17 | 710.06 | 19.70 |
| 8 | 609.00 | 15.08 | 18 | 720.17 | 19.31 |
| 9 | 620.22 | 16.26 | 19 | 729.57 | 19.01 |
| 10 | 628.75 | 15.30 | 20 | 740.42 | 17.98 |
| | | | 21 | 750.16 | 17.97 |
| | | | 22 | 769.89 | 18.72 |
| | | | 23 | 780.49 | 17.36 |
| | | | 24 | 790.30 | 17.39 |
| | | | 25 | 819.66 | 17.84 |

* Wavelength.

Table 3. Summary of flight campaigns for image acquisition.

| Plot | Date | Time (UTC−3 h) |
|---|---|---|
| P4, P5, P6 | 9 August 2016 | 11h46 |
| P1, P3 | 10 August 2016 | 13h05 |
| P8 to P15 | 1 July 2017 | 10h11 |
| P2, P7 | 1 July 2017 | 12h19 |
Table 4. Criteria for merging the superpixels.

| Criterion | Max. Height Class (m) | Standard Deviation (m) | JM Distance |
|---|---|---|---|
| 1 | 12.4–18.3 | ≥1.5 | ≥0.00215 |
| 2 | 18.4–24.1 | ≥2.5 | ≥0.00215 |
| 3 | 24.2–29.9 | ≥3.5 | ≥0.00215 |

(The JM-distance criterion of ≥0.00215 applies to all three height classes.)
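To make the quantities in Table 4 concrete, the sketch below computes the Jeffries–Matusita (JM) distance between the spectral samples of two superpixels under a Gaussian assumption (one common Bhattacharyya-based form) and looks up the height-class-specific thresholds. It is only an illustration of the table, not the authors' implementation; the function and variable names are hypothetical.

```python
import numpy as np

# Thresholds from Table 4: (height-class range in m) -> required standard
# deviation (m); the JM-distance criterion (>= 0.00215) is shared by all classes.
HEIGHT_CLASSES = [
    ((12.4, 18.3), 1.5),
    ((18.4, 24.1), 2.5),
    ((24.2, 29.9), 3.5),
]
JM_THRESHOLD = 0.00215


def jeffries_matusita(x1, x2):
    """JM distance between two sets of pixel spectra (n_pixels x n_bands),
    assuming Gaussian distributions (standard Bhattacharyya-based form)."""
    m1, m2 = x1.mean(axis=0), x2.mean(axis=0)
    c1, c2 = np.cov(x1, rowvar=False), np.cov(x2, rowvar=False)
    c = (c1 + c2) / 2.0
    diff = (m1 - m2).reshape(-1, 1)
    term1 = 0.125 * (diff.T @ np.linalg.inv(c) @ diff).item()
    term2 = 0.5 * np.log(np.linalg.det(c) /
                         np.sqrt(np.linalg.det(c1) * np.linalg.det(c2)))
    b = term1 + term2              # Bhattacharyya distance
    return 2.0 * (1.0 - np.exp(-b))  # JM distance in [0, 2]


def std_threshold_for(max_height):
    """Table 4 standard-deviation threshold for the height class that contains
    the superpixel's maximum CHM height (None if outside all classes)."""
    for (low, high), std_thr in HEIGHT_CLASSES:
        if low <= max_height <= high:
            return std_thr
    return None


def meets_table4_criteria(max_height, chm_std, spectra_a, spectra_b):
    """True when both Table 4 thresholds are met for a candidate pair
    (an illustrative reading of the table, not the published workflow)."""
    std_thr = std_threshold_for(max_height)
    if std_thr is None:
        return False
    return chm_std >= std_thr and jeffries_matusita(spectra_a, spectra_b) >= JM_THRESHOLD
```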
Table 5. Vegetation indices (VIs) calculated from hyperspectral orthomosaics.

| ID | Vegetation Index | Equation | Rikola Bands |
|---|---|---|---|
| NDVI | Normalized Difference Vegetation Index [80] | $\frac{\rho_{750}-\rho_{650}}{\rho_{750}+\rho_{650}}$ | $\frac{B21-B11}{B21+B11}$ |
| ND | Normalized Difference 682/553 [81,82] | $\frac{\rho_{682}-\rho_{553}}{\rho_{682}+\rho_{553}}$ | $\frac{B14-B4}{B14+B4}$ |
| NDVIh | Normalized Difference 780/550 Green NDVI hyper [83,84] | $\frac{\rho_{780}-\rho_{550}}{\rho_{780}+\rho_{550}}$ | $\frac{B23-B4}{B23+B4}$ |
| MCARI | Modified Chlorophyll Absorption in Reflectance Index [85] | $(\rho_{700}-\rho_{670})-0.2\,(\rho_{700}-\rho_{550})\left(\frac{\rho_{700}}{\rho_{670}}\right)$ | $(B16-B13)-0.2\,(B16-B4)\left(\frac{B16}{B13}\right)$ |
| PRI | Photochemical Reflectance Index [86] | $\frac{\rho_{535}-\rho_{565}}{\rho_{535}+\rho_{565}}$ | $\frac{B3-B5}{B3+B5}$ |
| PSRI | Plant Senescence Reflectance Index [87] | $\frac{\rho_{679}-\rho_{506}}{\rho_{750}}$ | $\frac{B14-B1}{B21}$ |
| PSSR | Pigment Specific Simple Ratio [88] | $\frac{\rho_{819}}{\rho_{679}}$ | $\frac{B25}{B14}$ |
| RE | Red edge [89] | $\frac{\rho_{670}-\rho_{780}}{2}$ | $\frac{B13-B23}{2}$ |
| REP | Red edge position [89,90,91] | $700+40\,\frac{\rho_{\mathrm{red\ edge}}-\rho_{700}}{\rho_{740}-\rho_{700}}$ | $700+40\,\frac{\rho_{\mathrm{red\ edge}}-B16}{B20-B16}$ |
| RENDVI | Red Edge Normalized Difference Vegetation Index [83] | $\frac{\rho_{753}-\rho_{700}}{\rho_{753}+\rho_{700}}$ | $\frac{B21-B16}{B21+B16}$ |
| SIPI | Structure Insensitive Pigment reflectance Index [92] | $\frac{\rho_{800}-\rho_{500}}{\rho_{800}+\rho_{680}}$ | $\frac{B24-B1}{B24+B1}$ |
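Because Table 5 expresses every index directly in terms of Rikola band numbers (Table 2), the indices reduce to simple band arithmetic. A minimal sketch follows, assuming the reflectance orthomosaic is available as a dictionary of per-band 2-D arrays; the data layout and function name are assumptions, and REP is omitted because its red-edge reflectance term is defined in the cited references [89,90,91] rather than in the table.

```python
import numpy as np

def compute_vis(b):
    """Vegetation indices of Table 5 from Rikola band reflectances.

    `b` is assumed to be a dict mapping band number (Table 2) to a 2-D
    reflectance array; all arrays share the same shape.
    """
    return {
        "NDVI":   (b[21] - b[11]) / (b[21] + b[11]),   # (rho750 - rho650) / (rho750 + rho650)
        "ND":     (b[14] - b[4])  / (b[14] + b[4]),    # normalized difference 682/553
        "NDVIh":  (b[23] - b[4])  / (b[23] + b[4]),    # green NDVI hyper 780/550
        "MCARI":  (b[16] - b[13]) - 0.2 * (b[16] - b[4]) * (b[16] / b[13]),
        "PRI":    (b[3]  - b[5])  / (b[3]  + b[5]),
        "PSRI":   (b[14] - b[1])  / b[21],
        "PSSR":   b[25] / b[14],
        "RE":     (b[13] - b[23]) / 2.0,
        "RENDVI": (b[21] - b[16]) / (b[21] + b[16]),
        "SIPI":   (b[24] - b[1])  / (b[24] + b[1]),
    }

# Illustrative usage with random reflectances for the 25 Rikola bands (Table 2):
bands = {i: np.random.rand(64, 64).astype(np.float32) for i in range(1, 26)}
indices = compute_vis(bands)
```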
Table 6. Raster metrics extracted from LiDAR FWF data (Source: Miltiadou et al., 2019 [72]).

| Metric | Description |
|---|---|
| Height | Distance between the lower boundary of the FW voxelized space and the top non-empty voxel of the column. |
| Thickness | Distance between the first and last non-empty voxel of each column. |
| Density | The ratio of the number of non-empty voxels to the Thickness of each column. |
| First Patch | Finds the first non-empty voxel of the column and counts downward how many adjacent non-empty voxels exist. |
| Last Patch | Finds the last non-empty voxel of the column and counts upward how many adjacent non-empty voxels exist. |
| Average Height Difference | A Laplacian edge detector: the height difference between a given column and each of its adjacent columns is calculated, and the average of these differences is taken. |
| Lowest Return | The voxel length multiplied by the number of voxels that exist after the lowest non-empty voxel of the column. |
| Maximum Intensity | The maximum intensity of each column. |
| Average Intensity | The average intensity of each column. |
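All Table 6 metrics are column-wise summaries of a voxelised full-waveform (FWF) intensity volume. The sketch below computes a subset of them, assuming a hypothetical 3-D array volume[z, y, x] of accumulated intensities (zero marking an empty voxel), with z increasing upward and a known voxel edge length; it illustrates the definitions rather than reproducing the implementation described in [72].

```python
import numpy as np

def fwf_column_metrics(volume, voxel_len):
    """Per-column rasters in the spirit of Table 6 (subset of the metrics).

    `volume[z, y, x]` holds voxelised FWF intensities (0 = empty voxel), with
    z increasing upward; `voxel_len` is the voxel edge length in metres.
    Returns a dict of 2-D arrays, one value per x-y column.
    """
    nz = volume.shape[0]
    occupied = volume > 0                       # non-empty voxels
    any_occ = occupied.any(axis=0)              # columns that contain returns

    z_idx = np.arange(nz).reshape(-1, 1, 1)
    top = np.where(any_occ, (occupied * z_idx).max(axis=0), -1)
    bottom = np.where(any_occ, np.where(occupied, z_idx, nz).min(axis=0), -1)

    # Height: lower boundary of the voxel space to the top non-empty voxel.
    height = np.where(any_occ, (top + 1) * voxel_len, 0.0)
    # Thickness: distance between the first and last non-empty voxel.
    thickness = np.where(any_occ, (top - bottom + 1) * voxel_len, 0.0)
    # Density: number of non-empty voxels divided by the column thickness.
    n_occ = occupied.sum(axis=0)
    density = np.divide(n_occ, thickness,
                        out=np.zeros_like(thickness), where=thickness > 0)
    # Lowest Return: voxel length times the number of voxels below the lowest
    # non-empty voxel (one reading of the Table 6 description).
    lowest_return = np.where(any_occ, bottom * voxel_len, 0.0)
    # Intensities: here the average is taken over non-empty voxels only.
    total = volume.sum(axis=0, dtype=float)
    avg_intensity = np.divide(total, n_occ,
                              out=np.zeros_like(total), where=n_occ > 0)

    return {
        "height": height,
        "thickness": thickness,
        "density": density,
        "lowest_return": lowest_return,
        "max_intensity": volume.max(axis=0),
        "avg_intensity": avg_intensity,
    }
```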
Table 7. Different scenarios tested for each dataset with the number of features used in each test.

| Scenario | Datasets | Number of Features |
|---|---|---|
| S1 | Tree Spectra | 25 |
| S2 | VIs | 11 |
| S3 | PR LiDAR | 53 |
| S4 | PR LiDAR PCA | 6 |
| S5 | FWF LiDAR | 9 |
| S6 | Tree Spectra + PR LiDAR | 78 |
| S7 | Tree Spectra + PR LiDAR PCA | 31 |
| S8 | Tree Spectra + FWF LiDAR | 34 |
| S9 | VIs + PR LiDAR | 64 |
| S10 | VIs + PR LiDAR PCA | 17 |
| S11 | VIs + FWF LiDAR | 20 |
| S12 | All Features (Tree Spectra + VIs + PR LiDAR + FWF LiDAR) | 98 |
| S13 | All_PCA | 10 |
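The feature counts in Table 7 are consistent with simple concatenation of the individual feature sets (e.g., 25 + 53 = 78 for S6 and 25 + 11 + 53 + 9 = 98 for S12). A minimal sketch of assembling such scenario matrices by column-stacking the blocks is shown below; the array names and sample count are illustrative only.

```python
import numpy as np

n = 200  # number of labelled samples (e.g., superpixels); illustrative only
tree_spectra = np.random.rand(n, 25)   # S1: 25 Rikola band reflectances
vis          = np.random.rand(n, 11)   # S2: 11 vegetation indices (Table 5)
pr_lidar     = np.random.rand(n, 53)   # S3: point-record (PR) LiDAR metrics
pr_lidar_pca = np.random.rand(n, 6)    # S4: leading PCA components of PR LiDAR
fwf_lidar    = np.random.rand(n, 9)    # S5: FWF LiDAR metrics (Table 6)

scenarios = {
    "S6":  np.hstack([tree_spectra, pr_lidar]),                  # 78 features
    "S8":  np.hstack([tree_spectra, fwf_lidar]),                 # 34 features
    "S12": np.hstack([tree_spectra, vis, pr_lidar, fwf_lidar]),  # 98 features
}
for name, X in scenarios.items():
    print(name, X.shape)  # e.g. S12 (200, 98)
```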
Disclaimer/Publisher's Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

© 2023 by the authors. Licensee MDPI, Basel, Switzerland.
This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite

MDPI and ACS Style
Pereira Martins-Neto, R.; Garcia Tommaselli, A.M.; Imai, N.N.; Honkavaara, E.; Miltiadou, M.; Saito Moriya, E.A.; David, H.C. Tree Species Classification in a Complex Brazilian Tropical Forest Using Hyperspectral and LiDAR Data. Forests 2023, 14, 945. https://doi.org/10.3390/f14050945

AMA Style
Pereira Martins-Neto R, Garcia Tommaselli AM, Imai NN, Honkavaara E, Miltiadou M, Saito Moriya EA, David HC. Tree Species Classification in a Complex Brazilian Tropical Forest Using Hyperspectral and LiDAR Data. Forests. 2023; 14(5):945. https://doi.org/10.3390/f14050945

Chicago/Turabian Style
Pereira Martins-Neto, Rorai, Antonio Maria Garcia Tommaselli, Nilton Nobuhiro Imai, Eija Honkavaara, Milto Miltiadou, Erika Akemi Saito Moriya, and Hassan Camil David. 2023. "Tree Species Classification in a Complex Brazilian Tropical Forest Using Hyperspectral and LiDAR Data" Forests 14, no. 5: 945. https://doi.org/10.3390/f14050945

APA Style
Pereira Martins-Neto, R., Garcia Tommaselli, A. M., Imai, N. N., Honkavaara, E., Miltiadou, M., Saito Moriya, E. A., & David, H. C. (2023). Tree Species Classification in a Complex Brazilian Tropical Forest Using Hyperspectral and LiDAR Data. Forests, 14(5), 945. https://doi.org/10.3390/f14050945

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
Supplementary Material

Supplementary File 1: ZIP-Document (ZIP, 6482 KiB), available at https://www.mdpi.com/1999-4907/14/5/945/s1.
value="pharmacoepidemiology">Pharmacoepidemiology</option> <option value="pharmacy">Pharmacy</option> <option value="philosophies">Philosophies</option> <option value="photochem">Photochem</option> <option value="photonics">Photonics</option> <option value="phycology">Phycology</option> <option value="physchem">Physchem</option> <option value="psf">Physical Sciences Forum</option> <option value="physics">Physics</option> <option value="physiologia">Physiologia</option> <option value="plants">Plants</option> <option value="plasma">Plasma</option> <option value="platforms">Platforms</option> <option value="pollutants">Pollutants</option> <option value="polymers">Polymers</option> <option value="polysaccharides">Polysaccharides</option> <option value="populations">Populations</option> <option value="poultry">Poultry</option> <option value="powders">Powders</option> <option value="proceedings">Proceedings</option> <option value="processes">Processes</option> <option value="prosthesis">Prosthesis</option> <option value="proteomes">Proteomes</option> <option value="psychiatryint">Psychiatry International</option> <option value="psychoactives">Psychoactives</option> <option value="psycholint">Psychology International</option> <option value="publications">Publications</option> <option value="qubs">Quantum Beam Science</option> <option value="quantumrep">Quantum Reports</option> <option value="quaternary">Quaternary</option> <option value="radiation">Radiation</option> <option value="reactions">Reactions</option> <option value="realestate">Real Estate</option> <option value="receptors">Receptors</option> <option value="recycling">Recycling</option> <option value="rsee">Regional Science and Environmental Economics</option> <option value="religions">Religions</option> <option value="remotesensing">Remote Sensing</option> <option value="reports">Reports</option> <option value="reprodmed">Reproductive Medicine</option> <option value="resources">Resources</option> <option value="rheumato">Rheumato</option> <option value="risks">Risks</option> <option value="robotics">Robotics</option> <option value="ruminants">Ruminants</option> <option value="safety">Safety</option> <option value="sci">Sci</option> <option value="scipharm">Scientia Pharmaceutica</option> <option value="sclerosis">Sclerosis</option> <option value="seeds">Seeds</option> <option value="sensors">Sensors</option> <option value="separations">Separations</option> <option value="sexes">Sexes</option> <option value="signals">Signals</option> <option value="sinusitis">Sinusitis</option> <option value="smartcities">Smart Cities</option> <option value="socsci">Social Sciences</option> <option value="siuj">Société Internationale d’Urologie Journal</option> <option value="societies">Societies</option> <option value="software">Software</option> <option value="soilsystems">Soil Systems</option> <option value="solar">Solar</option> <option value="solids">Solids</option> <option value="spectroscj">Spectroscopy Journal</option> <option value="sports">Sports</option> <option value="standards">Standards</option> <option value="stats">Stats</option> <option value="stresses">Stresses</option> <option value="surfaces">Surfaces</option> <option value="surgeries">Surgeries</option> <option value="std">Surgical Techniques Development</option> <option value="sustainability">Sustainability</option> <option value="suschem">Sustainable Chemistry</option> <option value="symmetry">Symmetry</option> <option value="synbio">SynBio</option> <option 
value="systems">Systems</option> <option value="targets">Targets</option> <option value="taxonomy">Taxonomy</option> <option value="technologies">Technologies</option> <option value="telecom">Telecom</option> <option value="textiles">Textiles</option> <option value="thalassrep">Thalassemia Reports</option> <option value="therapeutics">Therapeutics</option> <option value="thermo">Thermo</option> <option value="timespace">Time and Space</option> <option value="tomography">Tomography</option> <option value="tourismhosp">Tourism and Hospitality</option> <option value="toxics">Toxics</option> <option value="toxins">Toxins</option> <option value="transplantology">Transplantology</option> <option value="traumacare">Trauma Care</option> <option value="higheredu">Trends in Higher Education</option> <option value="tropicalmed">Tropical Medicine and Infectious Disease</option> <option value="universe">Universe</option> <option value="urbansci">Urban Science</option> <option value="uro">Uro</option> <option value="vaccines">Vaccines</option> <option value="vehicles">Vehicles</option> <option value="venereology">Venereology</option> <option value="vetsci">Veterinary Sciences</option> <option value="vibration">Vibration</option> <option value="virtualworlds">Virtual Worlds</option> <option value="viruses">Viruses</option> <option value="vision">Vision</option> <option value="waste">Waste</option> <option value="water">Water</option> <option value="wild">Wild</option> <option value="wind">Wind</option> <option value="women">Women</option> <option value="world">World</option> <option value="wevj">World Electric Vehicle Journal</option> <option value="youth">Youth</option> <option value="zoonoticdis">Zoonotic Diseases</option> </select> <input name="email" type="email" placeholder="Enter your email address..." required="required" /> <button class="genericCaptcha button button--dark UA_FooterNewsletterSubscribeButton" type="submit">Subscribe</button> </form> </div> </div> </div> <div id="footer-copyright"> <div class="row"> <div class="columns large-6 medium-6 small-12 text-left"> © 1996-2025 MDPI (Basel, Switzerland) unless otherwise stated </div> <div class="columns large-6 medium-6 small-12 small-text-left medium-text-right large-text-right"> <a data-dropdown="drop-view-disclaimer" aria-controls="drop-view-disclaimer" aria-expanded="false" data-options="align:top; is_hover:true; hover_timeout:2000;"> Disclaimer </a> <div id="drop-view-disclaimer" class="f-dropdown label__btn__dropdown label__btn__dropdown--wide text-left" data-dropdown-content aria-hidden="true" tabindex="-1"> Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. </div> <a href="/about/terms-and-conditions"> Terms and Conditions </a> <a href="/about/privacy"> Privacy Policy </a> </div> </div> </div> </div> <div id="cookie-notification" class="js-allow-cookies" style="display: none;"> <div class="columns large-10 medium-10 small-12"> We use cookies on our website to ensure you get the best experience.<br class="show-for-medium-up"/> Read more about our cookies <a href="/about/privacy">here</a>. 
</div> <div class="columns large-2 medium-2 small-12 small-only-text-left text-right"> <a class="button button--default" href="/accept_cookies">Accept</a> </div> </div> </div> <div id="main-share-modal" class="reveal-modal reveal-modal-new reveal-modal-new--small" data-reveal aria-labelledby="modalTitle" aria-hidden="true" role="dialog"> <div class="row"> <div class="small-12 columns"> <h2 style="margin: 0;">Share Link</h2> </div> <div class="small-12 columns"> <div class="social-media-links UA_ShareModalLinks" style="text-align: left;"> <a href="/cdn-cgi/l/email-protection#1f20397e726f246c6a7d757a7c6b22596d70723a2d2f525b4f563a2c5e3a2d2f3a2d2d4b6d7a7a3a2d2f4c6f7a7c767a6c3a2d2f5c737e6c6c7679767c7e6b7670713a2d2f76713a2d2f7e3a2d2f5c70726f737a673a2d2f5d6d7e657673767e713a2d2f4b6d706f767c7e733a2d2f59706d7a6c6b3a2d2f4a6c7671783a2d2f57666f7a6d6c6f7a7c6b6d7e733a2d2f7e717b3a2d2f53765b5e4d3a2d2f5b7e6b7e396e6a706b24397e726f247d707b6622776b6b6f6c25303068686831727b6f76317c7072302d2d28272f2a2f3a2c5e3a2f5e3a2f5e4b6d7a7a3a2d2f4c6f7a7c767a6c3a2d2f5c737e6c6c7679767c7e6b7670713a2d2f76713a2d2f7e3a2d2f5c70726f737a673a2d2f5d6d7e657673767e713a2d2f4b6d706f767c7e733a2d2f59706d7a6c6b3a2d2f4a6c7671783a2d2f57666f7a6d6c6f7a7c6b6d7e733a2d2f7e717b3a2d2f53765b5e4d3a2d2f5b7e6b7e15154b77766c3a2d2f6c6b6a7b663a2d2f7a676f7a6d76727a716b6c3a2d2f68766b773a2d2f7b7679797a6d7a716b3a2d2f7c70727d76717e6b7670716c3a2d2f70793a2d2f4a5e493a2d2f77666f7a6d6c6f7a7c6b6d7e733a2d2f7b7e6b7e3a2d2f7e717b3a2d2f53765b5e4d3a2d2f727a6b6d767c6c3a2d2f79706d3a2d2f7c737e6c6c7679667671783a2d2f7a7678776b3a2d2f6b6d7a7a3a2d2f6c6f7a7c767a6c3a2d2f79706a717b3a2d2f76713a2d2f7e3a2d2f5d6d7e657673767e713a2d2f5e6b737e716b767c3a2d2f59706d7a6c6b3a2d2f6d7a72717e716b3a2d5c3a2d2f6b777a3a2d2f72706c6b3a2d2f7b7a786d7e7b7a7b3a2d2f5d6d7e657673767e713a2d2f7d7670727a3a2d2f68766b773a2d2f777678773a2d2f796d7e78727a716b7e6b7670713a2d2f7d6a6b3a2d2f68766b773a2d2f776a787a3a2d2f6c6b6d6a7c6b6a6d7e733a2d2f7c70726f737a67766b66313a2d2f4b777a3a2d2f6c7a737a7c6b7670713a2d2f70793a2d2f6b777a3a2d2f6c6f7a7c767a6c3a2d2f687e6c3a2d2f7b70717a3a2d2f7d7e6c7a7b3a2d2f70713a2d2f6b777a3a2d2f716a727d7a6d3a2d2f70793a2d2f6b6d7a7a3a2d2f6c7e726f737a6c3a2d5c3a2d2f6877767c773a2d2f7a67766c6b3a2d2f76713a2d2f6b777a3a2d2f6f73706b3a2d2f7b7e6b7e3a2d2f7e717b3a2d2f76713a2d2f6b777a3a2d2f797e7c6b3a2d2f6b777a3a2d2f4a5e493a2d2f76727e787a6d663a2d2f7b707a6c3a2d2f71706b3a2d2f7e7c6e6a766d7a3a2d2f767179706d727e6b7670713a2d2f7d7a7370683a2d2f6b777a3a2d2f79706d7a6c6b3a2d2f7c7e71706f66313a2d2f5b6a7a3a2d2f6b703a2d2f6b777a3a2d2f7c70726f737a67766b663a2d2f70793a2d2f6b777a3a2d2f79706d7a6c6b3a2d5c3a2d2f707173663a2d2f6c6f7a7c767a6c3a2d2f6b777e6b3a2d2f7a67766c6b3a2d2f76713a2d2f6b777a3a2d2f6a6f6f7a6d3a2d2f7c7e71706f663a2d2f70793a2d2f6b777a3a2d2f6d7a72717e716b3a2d2f687a6d7a3a2d2f76717c736a7b7a7b3a2d2f76713a2d2f6b777a3a2d2f7c737e6c6c7679767c7e6b767071313a2d2f5e3a2d2f7c70727d76717e6b7670713a2d2f70793a2d2f77666f7a6d6c6f7a7c6b6d7e733a2d2f4a5e493a2d2f76727e787a6c3a2d2f7e717b3a2d2f53765b5e4d3a2d2f6f7076716b3a2d2f7c73706a7b6c3a2d2f687a6d7a3a2d2f76713a2d2f6b777a3a2d2f7a676f7a6d76727a716b313a2d2f4b777a3a2d2f77666f7a6d6c6f7a7c6b6d7e733a2d2f76727e787a6c3a2d2f687a6d7a3a2d2f6f77706b70786d7e72727a6b6d767c3a2d2f7e717b3a2d2f6d7e7b7670727a6b6d767c3a2d2f6f6d707c7a6c6c7a7b3a2d2f6b703a2d2f707d6b7e76713a2d2f706d6b777072706c7e767c6c3a2d2f68766b773a2d2f6d7a79737a7c6b7e717c7a3a2d2f797e7c6b706d3a2d2f697e736a7a6c313a2d2f4d7e683a2d2f6c6f7a7c6b6d7e3a2d2f687a6d7a3a2d2f7a676b6d7e7c6b7a7b3a2d2f796d70723a2d2f6b777a3a2d2f6b6d7a7a6c3a2d5c3a2d2f7e717b3a2d2f697a787a6b7e6b7670713a2d
class="material-icons">clear</i> </a> </div> <div id="weixin-share-modal" class="reveal-modal reveal-modal-new" data-reveal aria-labelledby="weixin-share-modal-title" aria-hidden="true" role="dialog"> <div class="row"> <div class="small-12 columns"> <h2 id="weixin-share-modal-title" style="margin: 0;">Share</h2> </div> <div class="small-12 columns"> <div class="weixin-qr-code-section"> <?xml version="1.0" standalone="no"?> <!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"> <svg width="300" height="300" version="1.1" xmlns="http://www.w3.org/2000/svg"> <desc>https://www.mdpi.com/2278050</desc> <g id="elements" fill="black" stroke="none"> <rect x="0" y="0" width="12" height="12" /> <rect x="12" y="0" width="12" height="12" /> <rect x="24" y="0" width="12" height="12" /> <rect x="36" y="0" width="12" height="12" /> <rect x="48" y="0" width="12" height="12" /> <rect x="60" y="0" width="12" height="12" /> <rect x="72" y="0" width="12" height="12" /> <rect x="120" y="0" width="12" height="12" /> <rect x="168" y="0" width="12" height="12" /> <rect x="180" y="0" width="12" height="12" /> <rect x="216" y="0" width="12" height="12" /> <rect x="228" y="0" width="12" height="12" /> <rect x="240" y="0" width="12" height="12" /> <rect x="252" y="0" width="12" height="12" /> <rect x="264" y="0" width="12" height="12" /> <rect x="276" y="0" width="12" height="12" /> <rect x="288" y="0" width="12" height="12" /> <rect x="0" y="12" width="12" height="12" /> <rect x="72" y="12" width="12" height="12" /> <rect x="108" y="12" width="12" height="12" /> <rect x="180" y="12" width="12" height="12" /> <rect x="216" y="12" width="12" height="12" /> <rect x="288" y="12" width="12" height="12" /> <rect x="0" y="24" width="12" height="12" /> <rect x="24" y="24" width="12" height="12" /> <rect x="36" y="24" width="12" height="12" /> <rect x="48" y="24" width="12" height="12" /> <rect x="72" y="24" width="12" height="12" /> <rect x="96" y="24" width="12" height="12" /> <rect x="132" y="24" width="12" height="12" /> <rect x="180" y="24" width="12" height="12" /> <rect x="216" y="24" width="12" height="12" /> <rect x="240" y="24" width="12" height="12" /> <rect x="252" y="24" width="12" height="12" /> <rect x="264" y="24" width="12" height="12" /> <rect x="288" y="24" width="12" height="12" /> <rect x="0" y="36" width="12" height="12" /> <rect x="24" y="36" width="12" height="12" /> <rect x="36" y="36" width="12" height="12" /> <rect x="48" y="36" width="12" height="12" /> <rect x="72" y="36" width="12" height="12" /> <rect x="108" y="36" width="12" height="12" /> <rect x="168" y="36" width="12" height="12" /> <rect x="216" y="36" width="12" height="12" /> <rect x="240" y="36" width="12" height="12" /> <rect x="252" y="36" width="12" height="12" /> <rect x="264" y="36" width="12" height="12" /> <rect x="288" y="36" width="12" height="12" /> <rect x="0" y="48" width="12" height="12" /> <rect x="24" y="48" width="12" height="12" /> <rect x="36" y="48" width="12" height="12" /> <rect x="48" y="48" width="12" height="12" /> <rect x="72" y="48" width="12" height="12" /> <rect x="108" y="48" width="12" height="12" /> <rect x="144" y="48" width="12" height="12" /> <rect x="156" y="48" width="12" height="12" /> <rect x="168" y="48" width="12" height="12" /> <rect x="180" y="48" width="12" height="12" /> <rect x="192" y="48" width="12" height="12" /> <rect x="216" y="48" width="12" height="12" /> <rect x="240" y="48" width="12" height="12" /> <rect x="252" y="48" width="12" 
height="12" /> <rect x="264" y="48" width="12" height="12" /> <rect x="288" y="48" width="12" height="12" /> <rect x="0" y="60" width="12" height="12" /> <rect x="72" y="60" width="12" height="12" /> <rect x="120" y="60" width="12" height="12" /> <rect x="132" y="60" width="12" height="12" /> <rect x="168" y="60" width="12" height="12" /> <rect x="180" y="60" width="12" height="12" /> <rect x="192" y="60" width="12" height="12" /> <rect x="216" y="60" width="12" height="12" /> <rect x="288" y="60" width="12" height="12" /> <rect x="0" y="72" width="12" height="12" /> <rect x="12" y="72" width="12" height="12" /> <rect x="24" y="72" width="12" height="12" /> <rect x="36" y="72" width="12" height="12" /> <rect x="48" y="72" width="12" height="12" /> <rect x="60" y="72" width="12" height="12" /> <rect x="72" y="72" width="12" height="12" /> <rect x="96" y="72" width="12" height="12" /> <rect x="120" y="72" width="12" height="12" /> <rect x="144" y="72" width="12" height="12" /> <rect x="168" y="72" width="12" height="12" /> <rect x="192" y="72" width="12" height="12" /> <rect x="216" y="72" width="12" height="12" /> <rect x="228" y="72" width="12" height="12" /> <rect x="240" y="72" width="12" height="12" /> <rect x="252" y="72" width="12" height="12" /> <rect x="264" y="72" width="12" height="12" /> <rect x="276" y="72" width="12" height="12" /> <rect x="288" y="72" width="12" height="12" /> <rect x="96" y="84" width="12" height="12" /> <rect x="132" y="84" width="12" height="12" /> <rect x="156" y="84" width="12" height="12" /> <rect x="168" y="84" width="12" height="12" /> <rect x="180" y="84" width="12" height="12" /> <rect x="0" y="96" width="12" height="12" /> <rect x="12" y="96" width="12" height="12" /> <rect x="24" y="96" width="12" height="12" /> <rect x="48" y="96" width="12" height="12" /> <rect x="60" y="96" width="12" height="12" /> <rect x="72" y="96" width="12" height="12" /> <rect x="84" y="96" width="12" height="12" /> <rect x="96" y="96" width="12" height="12" /> <rect x="144" y="96" width="12" height="12" /> <rect x="156" y="96" width="12" height="12" /> <rect x="180" y="96" width="12" height="12" /> <rect x="192" y="96" width="12" height="12" /> <rect x="204" y="96" width="12" height="12" /> <rect x="216" y="96" width="12" height="12" /> <rect x="264" y="96" width="12" height="12" /> <rect x="24" y="108" width="12" height="12" /> <rect x="36" y="108" width="12" height="12" /> <rect x="84" y="108" width="12" height="12" /> <rect x="108" y="108" width="12" height="12" /> <rect x="132" y="108" width="12" height="12" /> <rect x="192" y="108" width="12" height="12" /> <rect x="216" y="108" width="12" height="12" /> <rect x="288" y="108" width="12" height="12" /> <rect x="0" y="120" width="12" height="12" /> <rect x="12" y="120" width="12" height="12" /> <rect x="24" y="120" width="12" height="12" /> <rect x="36" y="120" width="12" height="12" /> <rect x="72" y="120" width="12" height="12" /> <rect x="96" y="120" width="12" height="12" /> <rect x="120" y="120" width="12" height="12" /> <rect x="132" y="120" width="12" height="12" /> <rect x="144" y="120" width="12" height="12" /> <rect x="192" y="120" width="12" height="12" /> <rect x="204" y="120" width="12" height="12" /> <rect x="216" y="120" width="12" height="12" /> <rect x="228" y="120" width="12" height="12" /> <rect x="264" y="120" width="12" height="12" /> <rect x="276" y="120" width="12" height="12" /> <rect x="288" y="120" width="12" height="12" /> <rect x="0" y="132" width="12" height="12" /> <rect x="12" y="132" 
width="12" height="12" /> <rect x="48" y="132" width="12" height="12" /> <rect x="84" y="132" width="12" height="12" /> <rect x="96" y="132" width="12" height="12" /> <rect x="108" y="132" width="12" height="12" /> <rect x="120" y="132" width="12" height="12" /> <rect x="132" y="132" width="12" height="12" /> <rect x="156" y="132" width="12" height="12" /> <rect x="168" y="132" width="12" height="12" /> <rect x="180" y="132" width="12" height="12" /> <rect x="216" y="132" width="12" height="12" /> <rect x="276" y="132" width="12" height="12" /> <rect x="24" y="144" width="12" height="12" /> <rect x="36" y="144" width="12" height="12" /> <rect x="48" y="144" width="12" height="12" /> <rect x="60" y="144" width="12" height="12" /> <rect x="72" y="144" width="12" height="12" /> <rect x="96" y="144" width="12" height="12" /> <rect x="120" y="144" width="12" height="12" /> <rect x="156" y="144" width="12" height="12" /> <rect x="168" y="144" width="12" height="12" /> <rect x="204" y="144" width="12" height="12" /> <rect x="216" y="144" width="12" height="12" /> <rect x="228" y="144" width="12" height="12" /> <rect x="252" y="144" width="12" height="12" /> <rect x="276" y="144" width="12" height="12" /> <rect x="288" y="144" width="12" height="12" /> <rect x="12" y="156" width="12" height="12" /> <rect x="96" y="156" width="12" height="12" /> <rect x="108" y="156" width="12" height="12" /> <rect x="120" y="156" width="12" height="12" /> <rect x="144" y="156" width="12" height="12" /> <rect x="156" y="156" width="12" height="12" /> <rect x="192" y="156" width="12" height="12" /> <rect x="216" y="156" width="12" height="12" /> <rect x="252" y="156" width="12" height="12" /> <rect x="288" y="156" width="12" height="12" /> <rect x="0" y="168" width="12" height="12" /> <rect x="36" y="168" width="12" height="12" /> <rect x="60" y="168" width="12" height="12" /> <rect x="72" y="168" width="12" height="12" /> <rect x="144" y="168" width="12" height="12" /> <rect x="168" y="168" width="12" height="12" /> <rect x="192" y="168" width="12" height="12" /> <rect x="204" y="168" width="12" height="12" /> <rect x="216" y="168" width="12" height="12" /> <rect x="228" y="168" width="12" height="12" /> <rect x="264" y="168" width="12" height="12" /> <rect x="276" y="168" width="12" height="12" /> <rect x="288" y="168" width="12" height="12" /> <rect x="12" y="180" width="12" height="12" /> <rect x="24" y="180" width="12" height="12" /> <rect x="60" y="180" width="12" height="12" /> <rect x="84" y="180" width="12" height="12" /> <rect x="96" y="180" width="12" height="12" /> <rect x="132" y="180" width="12" height="12" /> <rect x="156" y="180" width="12" height="12" /> <rect x="180" y="180" width="12" height="12" /> <rect x="192" y="180" width="12" height="12" /> <rect x="216" y="180" width="12" height="12" /> <rect x="240" y="180" width="12" height="12" /> <rect x="276" y="180" width="12" height="12" /> <rect x="0" y="192" width="12" height="12" /> <rect x="72" y="192" width="12" height="12" /> <rect x="108" y="192" width="12" height="12" /> <rect x="120" y="192" width="12" height="12" /> <rect x="144" y="192" width="12" height="12" /> <rect x="192" y="192" width="12" height="12" /> <rect x="204" y="192" width="12" height="12" /> <rect x="216" y="192" width="12" height="12" /> <rect x="228" y="192" width="12" height="12" /> <rect x="240" y="192" width="12" height="12" /> <rect x="252" y="192" width="12" height="12" /> <rect x="96" y="204" width="12" height="12" /> <rect x="108" y="204" width="12" height="12" /> 
<rect x="156" y="204" width="12" height="12" /> <rect x="180" y="204" width="12" height="12" /> <rect x="192" y="204" width="12" height="12" /> <rect x="240" y="204" width="12" height="12" /> <rect x="252" y="204" width="12" height="12" /> <rect x="276" y="204" width="12" height="12" /> <rect x="288" y="204" width="12" height="12" /> <rect x="0" y="216" width="12" height="12" /> <rect x="12" y="216" width="12" height="12" /> <rect x="24" y="216" width="12" height="12" /> <rect x="36" y="216" width="12" height="12" /> <rect x="48" y="216" width="12" height="12" /> <rect x="60" y="216" width="12" height="12" /> <rect x="72" y="216" width="12" height="12" /> <rect x="96" y="216" width="12" height="12" /> <rect x="120" y="216" width="12" height="12" /> <rect x="132" y="216" width="12" height="12" /> <rect x="156" y="216" width="12" height="12" /> <rect x="180" y="216" width="12" height="12" /> <rect x="192" y="216" width="12" height="12" /> <rect x="216" y="216" width="12" height="12" /> <rect x="240" y="216" width="12" height="12" /> <rect x="252" y="216" width="12" height="12" /> <rect x="276" y="216" width="12" height="12" /> <rect x="288" y="216" width="12" height="12" /> <rect x="0" y="228" width="12" height="12" /> <rect x="72" y="228" width="12" height="12" /> <rect x="96" y="228" width="12" height="12" /> <rect x="132" y="228" width="12" height="12" /> <rect x="144" y="228" width="12" height="12" /> <rect x="168" y="228" width="12" height="12" /> <rect x="192" y="228" width="12" height="12" /> <rect x="240" y="228" width="12" height="12" /> <rect x="252" y="228" width="12" height="12" /> <rect x="276" y="228" width="12" height="12" /> <rect x="0" y="240" width="12" height="12" /> <rect x="24" y="240" width="12" height="12" /> <rect x="36" y="240" width="12" height="12" /> <rect x="48" y="240" width="12" height="12" /> <rect x="72" y="240" width="12" height="12" /> <rect x="96" y="240" width="12" height="12" /> <rect x="108" y="240" width="12" height="12" /> <rect x="120" y="240" width="12" height="12" /> <rect x="144" y="240" width="12" height="12" /> <rect x="168" y="240" width="12" height="12" /> <rect x="192" y="240" width="12" height="12" /> <rect x="204" y="240" width="12" height="12" /> <rect x="216" y="240" width="12" height="12" /> <rect x="228" y="240" width="12" height="12" /> <rect x="240" y="240" width="12" height="12" /> <rect x="252" y="240" width="12" height="12" /> <rect x="276" y="240" width="12" height="12" /> <rect x="288" y="240" width="12" height="12" /> <rect x="0" y="252" width="12" height="12" /> <rect x="24" y="252" width="12" height="12" /> <rect x="36" y="252" width="12" height="12" /> <rect x="48" y="252" width="12" height="12" /> <rect x="72" y="252" width="12" height="12" /> <rect x="108" y="252" width="12" height="12" /> <rect x="120" y="252" width="12" height="12" /> <rect x="156" y="252" width="12" height="12" /> <rect x="168" y="252" width="12" height="12" /> <rect x="204" y="252" width="12" height="12" /> <rect x="228" y="252" width="12" height="12" /> <rect x="240" y="252" width="12" height="12" /> <rect x="252" y="252" width="12" height="12" /> <rect x="264" y="252" width="12" height="12" /> <rect x="0" y="264" width="12" height="12" /> <rect x="24" y="264" width="12" height="12" /> <rect x="36" y="264" width="12" height="12" /> <rect x="48" y="264" width="12" height="12" /> <rect x="72" y="264" width="12" height="12" /> <rect x="96" y="264" width="12" height="12" /> <rect x="108" y="264" width="12" height="12" /> <rect x="144" y="264" width="12" 
height="12" /> <rect x="168" y="264" width="12" height="12" /> <rect x="180" y="264" width="12" height="12" /> <rect x="240" y="264" width="12" height="12" /> <rect x="288" y="264" width="12" height="12" /> <rect x="0" y="276" width="12" height="12" /> <rect x="72" y="276" width="12" height="12" /> <rect x="96" y="276" width="12" height="12" /> <rect x="108" y="276" width="12" height="12" /> <rect x="132" y="276" width="12" height="12" /> <rect x="144" y="276" width="12" height="12" /> <rect x="156" y="276" width="12" height="12" /> <rect x="168" y="276" width="12" height="12" /> <rect x="180" y="276" width="12" height="12" /> <rect x="192" y="276" width="12" height="12" /> <rect x="216" y="276" width="12" height="12" /> <rect x="240" y="276" width="12" height="12" /> <rect x="252" y="276" width="12" height="12" /> <rect x="276" y="276" width="12" height="12" /> <rect x="0" y="288" width="12" height="12" /> <rect x="12" y="288" width="12" height="12" /> <rect x="24" y="288" width="12" height="12" /> <rect x="36" y="288" width="12" height="12" /> <rect x="48" y="288" width="12" height="12" /> <rect x="60" y="288" width="12" height="12" /> <rect x="72" y="288" width="12" height="12" /> <rect x="96" y="288" width="12" height="12" /> <rect x="120" y="288" width="12" height="12" /> <rect x="132" y="288" width="12" height="12" /> <rect x="144" y="288" width="12" height="12" /> <rect x="168" y="288" width="12" height="12" /> <rect x="180" y="288" width="12" height="12" /> <rect x="192" y="288" width="12" height="12" /> <rect x="228" y="288" width="12" height="12" /> <rect x="276" y="288" width="12" height="12" /> <rect x="288" y="288" width="12" height="12" /> </g> </svg> </div> </div> </div> <a class="close-reveal-modal" aria-label="Close"> <i class="material-icons">clear</i> </a> </div> <a href="#" class="back-to-top"><span class="show-for-medium-up">Back to Top</span><span class="show-for-small">Top</span></a> <script data-cfasync="false" src="/cdn-cgi/scripts/5c5dd728/cloudflare-static/email-decode.min.js"></script><script src="https://pub.mdpi-res.com/assets/js/modernizr-2.8.3.min.js?5227e0738f7f421d?1739771134"></script> <script src="https://pub.mdpi-res.com/assets/js/jquery-1.12.4.min.js?4f252523d4af0b47?1739771134"></script> <script src="https://pub.mdpi-res.com/assets/js/foundation-5.5.3.min.js?6b2ec41c18b29054?1739771134"></script> <script src="https://pub.mdpi-res.com/assets/js/foundation-5.5.3.equalizer.min.js?0f6c549b75ec554c?1739771134"></script> <script src="https://pub.mdpi-res.com/assets/js/jquery.multiselect.js?0edd3998731d1091?1739771134"></script> <script src="https://pub.mdpi-res.com/assets/js/jquery.cycle2.min.js?63413052928f97ee?1739771134"></script> <script> // old browser fix - this way the console log rows won't throw (silent) errors in browsers not supporting console log if (!window.console) window.console = {}; if (!window.console.log) window.console.log = function () { }; var currentJournalNameSystem = "forests"; $(document).ready(function() { $('select.foundation-select').multiselect({ search: true, minHeight: 130, maxHeight: 130, }); $(document).foundation({ orbit: { timer_speed: 4000, }, reveal: { animation: 'fadeAndPop', animation_speed: 100, } }); $(".chosen-select").each(function(element) { var maxSelected = (undefined !== $(this).data('maxselectedoptions') ? 
$("#jmolModal"); var url = "/article/1139025/jsmol_viewer/__supplementary_id__"; url = url.replace(/__supplementary_id__/g, $(this).data('index')); $('#jsmol-content').attr('src', url); jmolModal.find(".content").html($(this).data('description')); jmolModal.foundation("reveal", "open"); }); }); !function() { "use strict"; function e(e) { try { if ("undefined" == typeof console) return; "error"in console ? console.error(e) : console.log(e) } catch (e) {} } function t(e) { return d.innerHTML = '<a href="' + e.replace(/"/g, "&quot;") + '"></a>', d.childNodes[0].getAttribute("href") || "" } function n(n, c) { var o = ""; var k = parseInt(n.substr(c + 4, 2), 16); for (var i = c; i < n.length; i += 2) { if (i != c + 4) { var s = parseInt(n.substr(i, 2), 16) ^ k; o += String.fromCharCode(s); } } try { o = decodeURIComponent(escape(o)); } catch (error) { console.error(error); } return t(o); } function c(t) { for (var r = t.querySelectorAll("a"), c = 0; c < r.length; c++) try { var o = r[c] , a = o.href.indexOf(l); a > -1 && (o.href = "mailto:" + n(o.href, a + l.length)) } catch (i) { e(i) } } function o(t) { for (var r = t.querySelectorAll(u), c = 0; c < r.length; c++) try { var o = r[c] , a = o.parentNode , i = o.getAttribute(f); if (i) { var l = n(i, 0) , d = document.createTextNode(l); a.replaceChild(d, o) } } catch (h) { e(h) } } function a(t) { for (var r = t.querySelectorAll("template"), n = 0; n < r.length; n++) try { i(r[n].content) } catch (c) { e(c) } } function i(t) { try { c(t), o(t), a(t) } catch (r) { e(r) } } var l = "/cnd-cgi/l/email-protection#" , u = ".__cf_email__" , f = "data-cfemail" , d = document.createElement("div"); i(document), function() { var e = document.currentScript || document.scripts[document.scripts.length - 1]; e.parentNode.removeChild(e) }() }(); </script><script type="text/javascript"> function setCookie(cname, cvalue, ctime) { ctime = (typeof ctime === 'undefined') ? 
10*365*24*60*60*1000 : ctime; // default => 10 years var d = new Date(); d.setTime(d.getTime() + ctime); // ==> 1 hour = 60*60*1000 var expires = "expires="+d.toUTCString(); document.cookie = cname + "=" + cvalue + "; " + expires +"; path=/"; } function getCookie(cname) { var name = cname + "="; var ca = document.cookie.split(';'); for(var i=0; i<ca.length; i++) { var c = ca[i]; while (c.charAt(0)==' ') c = c.substring(1); if (c.indexOf(name) == 0) return c.substring(name.length, c.length); } return ""; } </script><script type="text/javascript" src="https://d1bxh8uas1mnw7.cloudfront.net/assets/embed.js"></script><script> $(document).ready(function() { if ($("#js-similarity-related-data").length > 0) { $.ajax({ url: '/article/1139025/similarity-related', success: function(response) { $("#js-similarity-related-data").html(response); $("#js-related-articles-menu").show(); $(document).foundation('tab', 'reflow'); MathJax.Hub.Queue(["Typeset", MathJax.Hub]); } }); } }); </script><link rel="stylesheet" href="https://pub.mdpi-res.com/assets/css/jquery-ui-1.10.4.custom.min.css?80647d88647bf347?1739771134"><link rel="stylesheet" href="https://pub.mdpi-res.com/assets/css/magnific-popup.min.css?04d343e036f8eecd?1739771134"><script type="text/javascript" src="https://pub.mdpi-res.com/assets/js/magnific-popup.min.js?2be3d9e7dc569146?1739771134"></script><script> $(function() { $(".js-show-more-academic-editors").on("click", function(e) { e.preventDefault(); $(this).hide(); $(".academic-editor-container").removeClass("hidden"); }); }); </script> <link rel="stylesheet" href="https://pub.mdpi-res.com/assets/css/vmap/jqvmap.min.css?126a06688aa11c13?1739771134"> <script src="https://pub.mdpi-res.com/assets/js/vmap/jquery.vmap.min.js?935f68d33bdd88a1?1739771134"></script> <script src="https://pub.mdpi-res.com/assets/js/vmap/jquery.vmap.world.js?16677403c0e1bef1?1739771134"></script> <script> function updateSlick() { $('.multiple-items').slick('setPosition'); } $(document).ready(function() { $('.multiple-items').slick({ slidesToShow: 1, nextArrow: '<a class="slick-next" href="#"><i class="material-icons">chevron_right</i></a>', prevArrow: '<a class="slick-prev" href="#"><i class="material-icons">chevron_left</i></a>', slidesToScroll: 1, responsive: [ { breakpoint: 1024, settings: { slidesToShow: 1, slidesToScroll: 1, } }, { breakpoint: 600, settings: { slidesToShow: 1, slidesToScroll: 1, } }, { breakpoint: 480, settings: { slidesToShow: 1, slidesToScroll: 1, } } ] }); $('.multiple-items').show(); $(document).on('click', '.reviewReportSelector', function(e) { let path = $(this).attr('data-path'); handleReviews(path, $(this)); }); $(document).on('click', '.viewReviewReports', function(e) { let versionOne = $('#versionTab_1'); if (!versionOne.hasClass('activeTab')) { let path = $(this).attr('data-path'); handleReviews(path, versionOne); } location.href = "#reviewReports"; }); $(document).on('click', '.reviewersResponse, .authorResponse', function(e) { let version = $(this).attr('data-version'); let targetVersion = $('#versionTab_' + version); if (!targetVersion.hasClass('activeTab')) { let path = targetVersion.attr('data-path'); handleReviews(path, targetVersion); } location.href = $(this).attr('data-link'); }); $(document).on('click', '.tab', function (e) { e.preventDefault(); $('.tab').removeClass('activeTab'); $(this).addClass('activeTab') $('.tab').each(function() { $(this).closest('.tab-title').removeClass('active'); }); $(this).closest('.tab-title').addClass('active') }); }); function handleReviews(path, 
target) { $.ajax({ url: path, context: this, success: function (data) { $('.activeTab').removeClass('activeTab'); target.addClass('activeTab'); $('#reviewSection').html(data.view); }, error: function (xhr, ajaxOptions, thrownError) { console.log(xhr.status); console.log(thrownError); } }); } </script> <script src="https://pub.mdpi-res.com/assets/js/xmltohtml/affix.js?v1?1739771134"></script> <script src="https://pub.mdpi-res.com/assets/js/xmltohtml/storage.js?e9b262d3a3476d25?1739771134"></script> <script src="https://pub.mdpi-res.com/assets/js/xmltohtml/jquery-scrollspy.js?09cbaec0dbb35a67?1739771134"></script> <script src="https://pub.mdpi-res.com/assets/js/xmltohtml/magnific-popup.js?4a09c18460afb26c?1739771134"></script> <script src="https://pub.mdpi-res.com/assets/js/xmltohtml/underscore.js?f893e294cde60c24?1739771134"></script> <script type="text/javascript"> $('document').ready(function(){ $("#left-column").addClass("show-for-large-up"); $("#middle-column").removeClass("medium-9").removeClass("left-bordered").addClass("medium-12"); $(window).on('resize scroll', function() { /* if ($('.button--drop-down').isInViewport($(".top-bar").outerHeight())) { */ if ($('.button--drop-down').isInViewport()) { $("#js-button-download").hide(); } else { $("#js-button-download").show(); } }); }); $(document).on('DOMNodeInserted', function(e) { var element = $(e.target); if (element.hasClass('menu') && element.hasClass('html-nav') ) { element.addClass("side-menu-ul"); } }); </script> <script src="https://pub.mdpi-res.com/assets/js/xmltohtml/articles.js?5118449d9ad8913a?1739771134"></script> <script> repositionOpenSideBar = function() { $('#left-column').addClass("show-for-large-up show-for-medium-up").show(); $('#middle-column').removeClass('large-12').removeClass('medium-12'); $('#middle-column').addClass('large-9'); } repositionCloseSideBar = function() { $('#left-column').removeClass("show-for-large-up show-for-medium-up").hide(); $('#middle-column').removeClass('large-9'); $('#middle-column').addClass('large-12').addClass('medium-12'); } </script> <!--[if lt IE 9]> <script src="https://pub.mdpi-res.com/assets/js/ie8/ie8.js?6eef8fcbc831f5bd?1739771134"></script> <script src="https://pub.mdpi-res.com/assets/js/ie8/jquery.xdomainrequest.min.js?a945caca315782b0?1739771134"></script> <![endif]--> <!-- Twitter universal website tag code --> <script type="text/plain" data-cookieconsent="marketing"> !function(e,t,n,s,u,a){e.twq||(s=e.twq=function(){s.exe?s.exe.apply(s,arguments):s.queue.push(arguments); },s.version='1.1',s.queue=[],u=t.createElement(n),u.async=!0,u.src='//static.ads-twitter.com/uwt.js', a=t.getElementsByTagName(n)[0],a.parentNode.insertBefore(u,a))}(window,document,'script'); // Insert Twitter Pixel ID and Standard Event data below twq('init','o2pip'); twq('track','PageView'); </script> <!-- End Twitter universal website tag code --> <script>(function(){function c(){var b=a.contentDocument||a.contentWindow.document;if(b){var d=b.createElement('script');d.innerHTML="window.__CF$cv$params={r:'913416e13cde89b0',t:'MTczOTc3NzYwOC4wMDAwMDA='};var a=document.createElement('script');a.nonce='';a.src='/cdn-cgi/challenge-platform/scripts/jsd/main.js';document.getElementsByTagName('head')[0].appendChild(a);";b.getElementsByTagName('head')[0].appendChild(d)}}if(document.body){var 
a=document.createElement('iframe');a.height=1;a.width=1;a.style.position='absolute';a.style.top=0;a.style.left=0;a.style.border='none';a.style.visibility='hidden';document.body.appendChild(a);if('loading'!==document.readyState)c();else if(window.addEventListener)document.addEventListener('DOMContentLoaded',c);else{var e=document.onreadystatechange||function(){};document.onreadystatechange=function(b){e(b);'loading'!==document.readyState&&(document.onreadystatechange=e,c())}}}})();</script></body> </html>

Pages: 1 2 3 4 5 6 7 8 9 10