<!DOCTYPE html> <html> <head> <meta charset="utf-8"> <link rel="search" type="application/opensearchdescription+xml" href="/open_search.xml" title="Academia.edu"> <meta content="width=device-width, initial-scale=1" name="viewport"> <meta name="google-site-verification" content="bKJMBZA7E43xhDOopFZkssMMkBRjvYERV-NaN4R6mrs"> <meta name="csrf-param" content="authenticity_token" /> <meta name="csrf-token" content="RcFvJSwS8N7GM4Ybw6UuHvL88o4oUD2W1wGsaBsl7q+jLXkivPkwEpJiDbG3KpdbBm3t7UPWyd/l6SkCssKZ6A==" /> <meta name="citation_title" content="Study of Occlusion-Robust Object Detection with Soft-NMS and Contextual Feature Extraction" /> <meta name="citation_publication_date" content="2021/01/01" /> <meta name="citation_journal_title" content="Journal of the Japan Society for Precision Engineering" /> <meta name="citation_author" content="Trọng Phan" /> <meta name="twitter:card" content="summary" /> <meta name="twitter:url" content="https://www.academia.edu/117398759/Study_of_Occlusion_Robust_Object_Detection_with_Soft_NMS_and_Contextual_Feature_Extraction" /> <meta name="twitter:title" content="Study of Occlusion-Robust Object Detection with Soft-NMS and Contextual Feature Extraction" /> <meta name="twitter:description" content="Academia.edu is a platform for academics to share research papers." 
/> <meta name="twitter:image" content="https://0.academia-photos.com/305263894/150060214/139636010/s200_tr_ng.phan.jpeg" /> <meta property="fb:app_id" content="2369844204" /> <meta property="og:type" content="article" /> <meta property="og:url" content="https://www.academia.edu/117398759/Study_of_Occlusion_Robust_Object_Detection_with_Soft_NMS_and_Contextual_Feature_Extraction" /> <meta property="og:title" content="Study of Occlusion-Robust Object Detection with Soft-NMS and Contextual Feature Extraction" /> <meta property="og:image" content="http://a.academia-assets.com/images/open-graph-icons/fb-paper.gif" /> <meta property="og:description" content="Study of Occlusion-Robust Object Detection with Soft-NMS and Contextual Feature Extraction" /> <meta property="article:author" content="https://independent.academia.edu/Tr%E1%BB%8DngPhan42" /> <meta name="description" content="Study of Occlusion-Robust Object Detection with Soft-NMS and Contextual Feature Extraction" /> <title>(PDF) Study of Occlusion-Robust Object Detection with Soft-NMS and Contextual Feature Extraction | Trọng Phan - Academia.edu</title> <link rel="canonical" href="https://www.academia.edu/117398759/Study_of_Occlusion_Robust_Object_Detection_with_Soft_NMS_and_Contextual_Feature_Extraction" /> <script async src="https://www.googletagmanager.com/gtag/js?id=G-5VKX33P2DS"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-5VKX33P2DS', { cookie_domain: 'academia.edu', send_page_view: false, }); gtag('event', 'page_view', { 'controller': "single_work", 'action': "show", 'controller_action': 'single_work#show', 'logged_in': 'false', 'edge': 'unknown', // Send nil if there is no A/B test bucket, in case some records get logged // with missing data - that way we can distinguish between the two cases. 
// ab_test_bucket should be of the form <ab_test_name>:<bucket>
'ab_test_bucket': null, }) </script> <script> var $controller_name = 'single_work'; var $action_name = "show"; var $rails_env = 'production'; var $app_rev = 'ca4e910d46e38122317a3848288563fbc11da57e'; var $domain = 'academia.edu'; var $app_host = "academia.edu"; var $asset_host = "academia-assets.com"; var $start_time = new Date().getTime(); var $recaptcha_key = "6LdxlRMTAAAAADnu_zyLhLg0YF9uACwz78shpjJB"; var $recaptcha_invisible_key = "6Lf3KHUUAAAAACggoMpmGJdQDtiyrjVlvGJ6BbAj"; var $disableClientRecordHit = false; </script> <script> window.require = { config: function() { return function() {} } } </script> <script> window.Aedu = window.Aedu || {}; window.Aedu.hit_data = null; window.Aedu.serverRenderTime = new Date(1732746767000); window.Aedu.timeDifference = new Date().getTime() - 1732746767000; </script> <script type="application/ld+json">{"@context":"https://schema.org","@type":"ScholarlyArticle","abstract":null,"author":[{"@context":"https://schema.org","@type":"Person","name":"Trọng Phan"}],"contributor":[],"dateCreated":"2024-04-12","dateModified":null,"datePublished":"2021-01-01","headline":"Study of Occlusion-Robust Object Detection with Soft-NMS and Contextual Feature Extraction","inLanguage":"ja","keywords":["Computer Science","Artificial Intelligence","Computer Vision","Feature Extraction"],"locationCreated":null,"publication":"Journal of the Japan Society for Precision Engineering","publisher":{"@context":"https://schema.org","@type":"Organization","name":"Japan Society for Precision Engineering"},"image":null,"thumbnailUrl":null,"url":"https://www.academia.edu/117398759/Study_of_Occlusion_Robust_Object_Detection_with_Soft_NMS_and_Contextual_Feature_Extraction","sourceOrganization":[{"@context":"https://schema.org","@type":"EducationalOrganization","name":null}]}</script><link rel="stylesheet" media="all" 
href="//a.academia-assets.com/assets/single_work_page/loswp-102fa537001ba4d8dcd921ad9bd56c474abc201906ea4843e7e7efe9dfbf561d.css" /><link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/design_system/body-8d679e925718b5e8e4b18e9a4fab37f7eaa99e43386459376559080ac8f2856a.css" /><link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/design_system/button-3cea6e0ad4715ed965c49bfb15dedfc632787b32ff6d8c3a474182b231146ab7.css" /><link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/design_system/text_button-73590134e40cdb49f9abdc8e796cc00dc362693f3f0f6137d6cf9bb78c318ce7.css" /><link crossorigin="" href="https://fonts.gstatic.com/" rel="preconnect" /><link href="https://fonts.googleapis.com/css2?family=DM+Sans:ital,opsz,wght@0,9..40,100..1000;1,9..40,100..1000&family=Gupter:wght@400;500;700&family=IBM+Plex+Mono:wght@300;400&family=Material+Symbols+Outlined:opsz,wght,FILL,GRAD@20,400,0,0&display=swap" rel="stylesheet" /><link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/design_system/common-10fa40af19d25203774df2d4a03b9b5771b45109c2304968038e88a81d1215c5.css" /> </head> <body> <div id='react-modal'></div> <div class="js-upgrade-ie-banner" style="display: none; text-align: center; padding: 8px 0; background-color: #ebe480;"><p style="color: #000; font-size: 12px; margin: 0 0 4px;">Academia.edu no longer supports Internet Explorer.</p><p style="color: #000; font-size: 12px; margin: 0;">To browse Academia.edu and the wider internet faster and more securely, please take a few seconds to <a href="https://www.academia.edu/upgrade-browser">upgrade your browser</a>.</p></div><script>// Show this banner for all versions of IE
if (!!window.MSInputMethodContext || /(MSIE)/.test(navigator.userAgent)) { document.querySelector('.js-upgrade-ie-banner').style.display = 'block'; }</script> <div class="bootstrap login"><div class="modal fade login-modal" id="login-modal"><div class="login-modal-dialog 
modal-dialog"><div class="modal-content"><div class="modal-header"><button class="close close" data-dismiss="modal" type="button"><span aria-hidden="true">×</span><span class="sr-only">Close</span></button><h4 class="modal-title text-center"><strong>Log In</strong></h4></div><div class="modal-body"><div class="row"><div class="col-xs-10 col-xs-offset-1"><button class="btn btn-fb btn-lg btn-block btn-v-center-content" id="login-facebook-oauth-button"><svg style="float: left; width: 19px; line-height: 1em; margin-right: .3em;" aria-hidden="true" focusable="false" data-prefix="fab" data-icon="facebook-square" class="svg-inline--fa fa-facebook-square fa-w-14" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 448 512"><path fill="currentColor" d="M400 32H48A48 48 0 0 0 0 80v352a48 48 0 0 0 48 48h137.25V327.69h-63V256h63v-54.64c0-62.15 37-96.48 93.67-96.48 27.14 0 55.52 4.84 55.52 4.84v61h-31.27c-30.81 0-40.42 19.12-40.42 38.73V256h68.78l-11 71.69h-57.78V480H400a48 48 0 0 0 48-48V80a48 48 0 0 0-48-48z"></path></svg><small><strong>Log in</strong> with <strong>Facebook</strong></small></button><br /><button class="btn btn-google btn-lg btn-block btn-v-center-content" id="login-google-oauth-button"><svg style="float: left; width: 22px; line-height: 1em; margin-right: .3em;" aria-hidden="true" focusable="false" data-prefix="fab" data-icon="google-plus" class="svg-inline--fa fa-google-plus fa-w-16" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path fill="currentColor" d="M256,8C119.1,8,8,119.1,8,256S119.1,504,256,504,504,392.9,504,256,392.9,8,256,8ZM185.3,380a124,124,0,0,1,0-248c31.3,0,60.1,11,83,32.3l-33.6,32.6c-13.2-12.9-31.3-19.1-49.4-19.1-42.9,0-77.2,35.5-77.2,78.1S142.3,334,185.3,334c32.6,0,64.9-19.1,70.1-53.3H185.3V238.1H302.2a109.2,109.2,0,0,1,1.9,20.7c0,70.8-47.5,121.2-118.8,121.2ZM415.5,273.8v35.5H380V273.8H344.5V238.3H380V202.8h35.5v35.5h35.2v35.5Z"></path></svg><small><strong>Log in</strong> with 
<strong>Google</strong></small></button><br /><style type="text/css">.sign-in-with-apple-button { width: 100%; height: 52px; border-radius: 3px; border: 1px solid black; cursor: pointer; }</style><script src="https://appleid.cdn-apple.com/appleauth/static/jsapi/appleid/1/en_US/appleid.auth.js" type="text/javascript"></script><div class="sign-in-with-apple-button" data-border="false" data-color="white" id="appleid-signin"><span class="u-fs11">Sign Up with Apple</span></div><script>AppleID.auth.init({ clientId: 'edu.academia.applesignon', scope: 'name email', redirectURI: 'https://www.academia.edu/sessions', state: "6e7a31a3cf718b09b4bc3e2226d36fc613c29b69d2db5da6fde098728768a683", });</script><script>// Hacky way of checking if on fast loswp
if (window.loswp == null) { (function() { const Google = window?.Aedu?.Auth?.OauthButton?.Login?.Google; const Facebook = window?.Aedu?.Auth?.OauthButton?.Login?.Facebook; if (Google) { new Google({ el: '#login-google-oauth-button', rememberMeCheckboxId: 'remember_me', track: null }); } if (Facebook) { new Facebook({ el: '#login-facebook-oauth-button', rememberMeCheckboxId: 'remember_me', track: null }); } })(); }</script></div></div></div><div class="modal-body"><div class="row"><div class="col-xs-10 col-xs-offset-1"><div class="hr-heading login-hr-heading"><span class="hr-heading-text">or</span></div></div></div></div><div class="modal-body"><div class="row"><div class="col-xs-10 col-xs-offset-1"><form class="js-login-form" action="https://www.academia.edu/sessions" accept-charset="UTF-8" method="post"><input name="utf8" type="hidden" value="✓" autocomplete="off" /><input type="hidden" name="authenticity_token" value="vHyC5xrgws2foG2o7TtgdJ2XXe7vAKRG3bDlLnUXedVakJTgigsCAcvx5gKZtNkxaQZCjYSGUA/vWGBE3PAOkg==" autocomplete="off" /><div class="form-group"><label class="control-label" for="login-modal-email-input" style="font-size: 14px;">Email</label><input class="form-control" id="login-modal-email-input" name="login" 
type="email" /></div><div class="form-group"><label class="control-label" for="login-modal-password-input" style="font-size: 14px;">Password</label><input class="form-control" id="login-modal-password-input" name="password" type="password" /></div><input type="hidden" name="post_login_redirect_url" id="post_login_redirect_url" value="https://www.academia.edu/117398759/Study_of_Occlusion_Robust_Object_Detection_with_Soft_NMS_and_Contextual_Feature_Extraction" autocomplete="off" /><div class="checkbox"><label><input type="checkbox" name="remember_me" id="remember_me" value="1" checked="checked" /><small style="font-size: 12px; margin-top: 2px; display: inline-block;">Remember me on this computer</small></label></div><br /><input type="submit" name="commit" value="Log In" class="btn btn-primary btn-block btn-lg js-login-submit" data-disable-with="Log In" /></form><script>typeof window?.Aedu?.recaptchaManagedForm === 'function' && window.Aedu.recaptchaManagedForm( document.querySelector('.js-login-form'), document.querySelector('.js-login-submit') );</script><small style="font-size: 12px;"><br />or <a data-target="#login-modal-reset-password-container" data-toggle="collapse" href="javascript:void(0)">reset password</a></small><div class="collapse" id="login-modal-reset-password-container"><br /><div class="well margin-0x"><form class="js-password-reset-form" action="https://www.academia.edu/reset_password" accept-charset="UTF-8" method="post"><input name="utf8" type="hidden" value="✓" autocomplete="off" /><input type="hidden" name="authenticity_token" value="NWOZR8xtM+gemzIC+TcpTLPmige/L569bMWBiqGZ1GfTj49AXIbzJErKuaiNuJAJR3eVZNSpavReLQTgCH6jIA==" autocomplete="off" /><p>Enter the email address you signed up with and we'll email you a reset link.</p><div class="form-group"><input class="form-control" name="email" type="email" /></div><input class="btn btn-primary btn-block g-recaptcha js-password-reset-submit" data-sitekey="6Lf3KHUUAAAAACggoMpmGJdQDtiyrjVlvGJ6BbAj" 
type="submit" value="Email me a link" /></form></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/collapse-45805421cf446ca5adf7aaa1935b08a3a8d1d9a6cc5d91a62a2a3a00b20b3e6a.js"], function() { // from javascript_helper.rb
$("#login-modal-reset-password-container").on("shown.bs.collapse", function() { $(this).find("input[type=email]").focus(); }); }); </script> </div></div></div><div class="modal-footer"><div class="text-center"><small style="font-size: 12px;">Need an account? <a rel="nofollow" href="https://www.academia.edu/signup">Click here to sign up</a></small></div></div></div></div></div></div><script>// If we are on subdomain or non-bootstrapped page, redirect to login page instead of showing modal
(function(){ if (typeof $ === 'undefined') return; var host = window.location.hostname; if ((host === $domain || host === "www."+$domain) && (typeof $().modal === 'function')) { $("#nav_log_in").click(function(e) { // Don't follow the link and open the modal
e.preventDefault(); $("#login-modal").on('shown.bs.modal', function() { $(this).find("#login-modal-email-input").focus() }).modal('show'); }); } })()</script> <div id="fb-root"></div><script>window.fbAsyncInit = function() { FB.init({ appId: "2369844204", version: "v8.0", status: true, cookie: true, xfbml: true }); // Additional initialization code.
if (window.InitFacebook) { // facebook.ts already loaded, set it up.
window.InitFacebook(); } else { // Set a flag for facebook.ts to find when it loads.
window.academiaAuthReadyFacebook = true; } };</script> <div id="google-root"></div><script>window.loadGoogle = function() { if (window.InitGoogle) { // google.ts already loaded, set it up.
window.InitGoogle("331998490334-rsn3chp12mbkiqhl6e7lu2q0mlbu0f1b"); } else { // Set a flag for google.ts to use when it loads. 
window.GoogleClientID = "331998490334-rsn3chp12mbkiqhl6e7lu2q0mlbu0f1b"; } };</script> <div class="header--container" id="main-header-container"><div class="header--inner-container header--inner-container-ds2"><div class="header-ds2--left-wrapper"><div class="header-ds2--left-wrapper-inner"><a data-main-header-link-target="logo_home" href="https://www.academia.edu/"><img class="hide-on-desktop-redesign" style="height: 24px; width: 24px;" alt="Academia.edu" src="//a.academia-assets.com/images/academia-logo-redesign-2015-A.svg" width="24" height="24" /><img width="145.2" height="18" class="hide-on-mobile-redesign" style="height: 24px;" alt="Academia.edu" src="//a.academia-assets.com/images/academia-logo-redesign-2015.svg" /></a><div class="header--search-container header--search-container-ds2"><form class="js-SiteSearch-form select2-no-default-pills" action="https://www.academia.edu/search" accept-charset="UTF-8" method="get"><input name="utf8" type="hidden" value="✓" autocomplete="off" /><svg style="width: 14px; height: 14px;" aria-hidden="true" focusable="false" data-prefix="fas" data-icon="search" class="header--search-icon svg-inline--fa fa-search fa-w-16" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path fill="currentColor" d="M505 442.7L405.3 343c-4.5-4.5-10.6-7-17-7H372c27.6-35.3 44-79.7 44-128C416 93.1 322.9 0 208 0S0 93.1 0 208s93.1 208 208 208c48.3 0 92.7-16.4 128-44v16.3c0 6.4 2.5 12.5 7 17l99.7 99.7c9.4 9.4 24.6 9.4 33.9 0l28.3-28.3c9.4-9.4 9.4-24.6.1-34zM208 336c-70.7 0-128-57.2-128-128 0-70.7 57.2-128 128-128 70.7 0 128 57.2 128 128 0 70.7-57.2 128-128 128z"></path></svg><input class="header--search-input header--search-input-ds2 js-SiteSearch-form-input" data-main-header-click-target="search_input" name="q" placeholder="Search" type="text" /></form></div></div></div><nav class="header--nav-buttons header--nav-buttons-ds2 js-main-nav"><a class="ds2-5-button ds2-5-button--secondary js-header-login-url header-button-ds2 
header-login-ds2 hide-on-mobile-redesign" href="https://www.academia.edu/login" rel="nofollow">Log In</a><a class="ds2-5-button ds2-5-button--secondary header-button-ds2 hide-on-mobile-redesign" href="https://www.academia.edu/signup" rel="nofollow">Sign Up</a><button class="header--hamburger-button header--hamburger-button-ds2 hide-on-desktop-redesign js-header-hamburger-button"><div class="icon-bar"></div><div class="icon-bar" style="margin-top: 4px;"></div><div class="icon-bar" style="margin-top: 4px;"></div></button></nav></div><ul class="header--dropdown-container js-header-dropdown"><li class="header--dropdown-row"><a class="header--dropdown-link" href="https://www.academia.edu/login" rel="nofollow">Log In</a></li><li class="header--dropdown-row"><a class="header--dropdown-link" href="https://www.academia.edu/signup" rel="nofollow">Sign Up</a></li><li class="header--dropdown-row js-header-dropdown-expand-button"><button class="header--dropdown-button">more<svg aria-hidden="true" focusable="false" data-prefix="fas" data-icon="caret-down" class="header--dropdown-button-icon svg-inline--fa fa-caret-down fa-w-10" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 320 512"><path fill="currentColor" d="M31.3 192h257.3c17.8 0 26.7 21.5 14.1 34.1L174.1 354.8c-7.8 7.8-20.5 7.8-28.3 0L17.2 226.1C4.6 213.5 13.5 192 31.3 192z"></path></svg></button></li><li><ul class="header--expanded-dropdown-container"><li class="header--dropdown-row"><a class="header--dropdown-link" href="https://www.academia.edu/about">About</a></li><li class="header--dropdown-row"><a class="header--dropdown-link" href="https://www.academia.edu/press">Press</a></li><li class="header--dropdown-row"><a class="header--dropdown-link" href="https://medium.com/@academia">Blog</a></li><li class="header--dropdown-row"><a class="header--dropdown-link" href="https://www.academia.edu/documents">Papers</a></li><li class="header--dropdown-row"><a class="header--dropdown-link" 
href="https://www.academia.edu/terms">Terms</a></li><li class="header--dropdown-row"><a class="header--dropdown-link" href="https://www.academia.edu/privacy">Privacy</a></li><li class="header--dropdown-row"><a class="header--dropdown-link" href="https://www.academia.edu/copyright">Copyright</a></li><li class="header--dropdown-row"><a class="header--dropdown-link" href="https://www.academia.edu/hiring"><svg aria-hidden="true" focusable="false" data-prefix="fas" data-icon="briefcase" class="header--dropdown-row-icon svg-inline--fa fa-briefcase fa-w-16" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path fill="currentColor" d="M320 336c0 8.84-7.16 16-16 16h-96c-8.84 0-16-7.16-16-16v-48H0v144c0 25.6 22.4 48 48 48h416c25.6 0 48-22.4 48-48V288H320v48zm144-208h-80V80c0-25.6-22.4-48-48-48H176c-25.6 0-48 22.4-48 48v48H48c-25.6 0-48 22.4-48 48v80h512v-80c0-25.6-22.4-48-48-48zm-144 0H192V96h128v32z"></path></svg>We're Hiring!</a></li><li class="header--dropdown-row"><a class="header--dropdown-link" href="https://support.academia.edu/"><svg aria-hidden="true" focusable="false" data-prefix="fas" data-icon="question-circle" class="header--dropdown-row-icon svg-inline--fa fa-question-circle fa-w-16" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path fill="currentColor" d="M504 256c0 136.997-111.043 248-248 248S8 392.997 8 256C8 119.083 119.043 8 256 8s248 111.083 248 248zM262.655 90c-54.497 0-89.255 22.957-116.549 63.758-3.536 5.286-2.353 12.415 2.715 16.258l34.699 26.31c5.205 3.947 12.621 3.008 16.665-2.122 17.864-22.658 30.113-35.797 57.303-35.797 20.429 0 45.698 13.148 45.698 32.958 0 14.976-12.363 22.667-32.534 33.976C247.128 238.528 216 254.941 216 296v4c0 6.627 5.373 12 12 12h56c6.627 0 12-5.373 12-12v-1.333c0-28.462 83.186-29.647 83.186-106.667 0-58.002-60.165-102-116.531-102zM256 338c-25.365 0-46 20.635-46 46 0 25.364 20.635 46 46 46s46-20.636 46-46c0-25.365-20.635-46-46-46z"></path></svg>Help Center</a></li><li 
class="header--dropdown-row js-header-dropdown-collapse-button"><button class="header--dropdown-button">less<svg aria-hidden="true" focusable="false" data-prefix="fas" data-icon="caret-up" class="header--dropdown-button-icon svg-inline--fa fa-caret-up fa-w-10" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 320 512"><path fill="currentColor" d="M288.662 352H31.338c-17.818 0-26.741-21.543-14.142-34.142l128.662-128.662c7.81-7.81 20.474-7.81 28.284 0l128.662 128.662c12.6 12.599 3.676 34.142-14.142 34.142z"></path></svg></button></li></ul></li></ul></div> <script src="//a.academia-assets.com/assets/webpack_bundles/fast_loswp-bundle-71e03f93a0fba43adc4297a781256a72e56b0d578ac299a0d81b09f4c7bc6f70.js" defer="defer"></script><script>window.loswp = {}; window.loswp.author = 305263894; window.loswp.bulkDownloadFilterCounts = {}; window.loswp.hasDownloadableAttachment = true; window.loswp.hasViewableAttachments = true; // TODO: just use routes for this
window.loswp.loginUrl = "https://www.academia.edu/login?post_login_redirect_url=https%3A%2F%2Fwww.academia.edu%2F117398759%2FStudy_of_Occlusion_Robust_Object_Detection_with_Soft_NMS_and_Contextual_Feature_Extraction%3Fauto%3Ddownload"; window.loswp.translateUrl = "https://www.academia.edu/login?post_login_redirect_url=https%3A%2F%2Fwww.academia.edu%2F117398759%2FStudy_of_Occlusion_Robust_Object_Detection_with_Soft_NMS_and_Contextual_Feature_Extraction%3Fshow_translation%3Dtrue"; window.loswp.previewableAttachments = [{"id":113265852,"identifier":"Attachment_113265852","shouldShowBulkDownload":false}]; window.loswp.shouldDetectTimezone = true; window.loswp.shouldShowBulkDownload = true; window.loswp.showSignupCaptcha = false; window.loswp.willEdgeCache = false; window.loswp.work = {"work":{"id":117398759,"created_at":"2024-04-12T07:22:52.411-07:00","from_world_paper_id":252787896,"updated_at":"2024-04-12T09:22:01.730-07:00","_data":{"publisher":"Japan Society for Precision 
Engineering","publication_date":"2021,,","publication_name":"Journal of the Japan Society for Precision Engineering"},"document_type":"paper","pre_hit_view_count_baseline":null,"quality":"high","language":"ja","title":"Study of Occlusion-Robust Object Detection with Soft-NMS and Contextual Feature Extraction","broadcastable":false,"draft":null,"has_indexable_attachment":true,"indexable":true}}["work"]; window.loswp.workCoauthors = [305263894]; window.loswp.locale = "en"; window.loswp.countryCode = "SG"; window.loswp.cwvAbTestBucket = ""; window.loswp.designVariant = "ds_vanilla"; window.loswp.fullPageMobileSutdModalVariant = "full_page_mobile_sutd_modal"; window.loswp.useOptimizedScribd4genScript = false; window.loswp.appleClientId = 'edu.academia.applesignon';</script><script defer="" src="https://accounts.google.com/gsi/client"></script><div class="ds-loswp-container"><div class="ds-work-card--grid-container"><div class="ds-work-card--container js-loswp-work-card"><div class="ds-work-card--cover"><div class="ds-work-cover--wrapper"><div class="ds-work-cover--container"><button class="ds-work-cover--clickable js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;swp-splash-paper-cover&quot;,&quot;attachmentId&quot;:113265852,&quot;attachmentType&quot;:&quot;pdf&quot;}"><img alt="First page of “Study of Occlusion-Robust Object Detection with Soft-NMS and Contextual Feature Extraction”" class="ds-work-cover--cover-thumbnail" src="https://0.academia-photos.com/attachment_thumbnails/113265852/mini_magick20240802-1-fp66dn.png?1722635977" /><img alt="PDF Icon" class="ds-work-cover--file-icon" src="//a.academia-assets.com/assets/single_work_splash/adobe.icon-574afd46eb6b03a77a153a647fb47e30546f9215c0ee6a25df597a779717f9ef.svg" /><div class="ds-work-cover--hover-container"><span class="material-symbols-outlined" style="font-size: 20px" translate="no">download</span><p>Download Free PDF</p></div><div class="ds-work-cover--ribbon-container">Download Free PDF</div><div 
class="ds-work-cover--ribbon-triangle"></div></button></div></div></div><div class="ds-work-card--work-information"><h1 class="ds-work-card--work-title">Study of Occlusion-Robust Object Detection with Soft-NMS and Contextual Feature Extraction</h1><div class="ds-work-card--work-authors ds-work-card--detail"><a class="ds-work-card--author js-wsj-grid-card-author ds2-5-body-md ds2-5-body-link" data-author-id="305263894" href="https://independent.academia.edu/Tr%E1%BB%8DngPhan42"><img alt="Profile image of Trọng Phan" class="ds-work-card--author-avatar" src="https://0.academia-photos.com/305263894/150060214/139636010/s65_tr_ng.phan.jpeg" />Trọng Phan</a></div><div class="ds-work-card--detail"><p class="ds-work-card--detail ds2-5-body-sm">2021, Journal of the Japan Society for Precision Engineering</p><div class="ds-work-card--work-metadata"><div class="ds-work-card--work-metadata__stat"><span class="material-symbols-outlined" style="font-size: 20px" translate="no">visibility</span><p class="ds2-5-body-sm" id="work-metadata-view-count">…</p></div><div class="ds-work-card--work-metadata__stat"><span class="material-symbols-outlined" style="font-size: 20px" translate="no">description</span><p class="ds2-5-body-sm">7 pages</p></div><div class="ds-work-card--work-metadata__stat"><span class="material-symbols-outlined" style="font-size: 20px" translate="no">link</span><p class="ds2-5-body-sm">1 file</p></div></div><script>(async () => { const workId = 117398759; const worksViewsPath = "/v0/works/views?subdomain_param=api&work_ids%5B%5D=117398759"; const getWorkViews = async (workId) => { const response = await fetch(worksViewsPath); if (!response.ok) { throw new Error('Failed to load work views'); } const data = await response.json(); return data.views[workId]; }; // Get the view count for the work - we send this immediately rather than waiting for
// the DOM to load, so it can be available as soon as possible (but without holding up
// the backend or other resource requests, because it's a bit expensive and not critical).
const viewCount = await getWorkViews(workId); const updateViewCount = (viewCount) => { const viewCountNumber = Number(viewCount); if (!viewCountNumber) { throw new Error('Failed to parse view count'); } const commaizedViewCount = viewCountNumber.toLocaleString(); const viewCountBody = document.getElementById('work-metadata-view-count'); if (viewCountBody) { viewCountBody.textContent = `${commaizedViewCount} views`; } else { throw new Error('Failed to find work views element'); } }; // If the DOM is still loading, wait for it to be ready before updating the view count.
if (document.readyState === "loading") { document.addEventListener('DOMContentLoaded', () => { updateViewCount(viewCount); });
// Otherwise, just update it immediately.
} else { updateViewCount(viewCount); } })();</script></div><div class="ds-work-card--button-container"><button class="ds2-5-button js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;continue-reading-button--work-card&quot;,&quot;attachmentId&quot;:113265852,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;workUrl&quot;:&quot;https://www.academia.edu/117398759/Study_of_Occlusion_Robust_Object_Detection_with_Soft_NMS_and_Contextual_Feature_Extraction&quot;}">See full PDF</button><button class="ds2-5-button ds2-5-button--secondary js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;download-pdf-button--work-card&quot;,&quot;attachmentId&quot;:113265852,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;workUrl&quot;:&quot;https://www.academia.edu/117398759/Study_of_Occlusion_Robust_Object_Detection_with_Soft_NMS_and_Contextual_Feature_Extraction&quot;}"><span class="material-symbols-outlined" style="font-size: 20px" translate="no">download</span>Download PDF</button></div></div></div></div><div data-auto_select="false" data-client_id="331998490334-rsn3chp12mbkiqhl6e7lu2q0mlbu0f1b" data-doc_id="113265852" data-landing_url="https://www.academia.edu/117398759/Study_of_Occlusion_Robust_Object_Detection_with_Soft_NMS_and_Contextual_Feature_Extraction" 
data-login_uri="https://www.academia.edu/registrations/google_one_tap" data-moment_callback="onGoogleOneTapEvent" id="g_id_onload"></div><div class="ds-top-related-works--grid-container"><div class="ds-related-content--container ds-top-related-works--container"><h2 class="ds-related-content--heading">Related papers</h2><div class="ds-related-work--container js-wsj-grid-card" data-collection-position="0" data-entity-id="117977765" data-sort-order="default"><a class="ds-related-work--title js-wsj-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/117977765/Identification_of_Conditions_with_High_Speed_and_Accuracy_of_Target_Prediction_Method_by_Switching_from_Ballistic_Eye_Movement_to_Homing_Eye_Movement">Identification of Conditions with High Speed and Accuracy of Target Prediction Method by Switching from Ballistic Eye Movement to Homing Eye Movement</a><div class="ds-related-work--metadata"><a class="js-wsj-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="54377698" href="https://independent.academia.edu/murataatsuo">atsuo murata</a></div><p class="ds-related-work--metadata ds2-5-body-xs">The Japanese Journal of Ergonomics, 2022</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;Identification of Conditions with High Speed and Accuracy of Target Prediction Method by Switching from Ballistic Eye Movement to Homing Eye Movement&quot;,&quot;attachmentId&quot;:113709684,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/117977765/Identification_of_Conditions_with_High_Speed_and_Accuracy_of_Target_Prediction_Method_by_Switching_from_Ballistic_Eye_Movement_to_Homing_Eye_Movement&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a 
class="ds2-5-text-link ds2-5-text-link--inline js-wsj-grid-card-view-pdf" href="https://www.academia.edu/117977765/Identification_of_Conditions_with_High_Speed_and_Accuracy_of_Target_Prediction_Method_by_Switching_from_Ballistic_Eye_Movement_to_Homing_Eye_Movement"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-wsj-grid-card" data-collection-position="1" data-entity-id="107378786" data-sort-order="default"><a class="ds-related-work--title js-wsj-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/107378786/Confident_Margin_%E3%82%92%E7%94%A8%E3%81%84%E3%81%9F_SVM_%E3%81%AE%E3%81%9F%E3%82%81%E3%81%AE%E7%89%B9%E5%BE%B4%E9%81%B8%E6%8A%9E%E6%89%8B%E6%B3%95">Feature Selection Method for SVM Using Confident Margin</a><div class="ds-related-work--metadata"><a class="js-wsj-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="252060107" href="https://independent.academia.edu/SatriyoNugroho45">Satriyo Nugroho</a></div><p class="ds-related-work--abstract ds2-5-body-sm">Abstract: This paper proposes a margin-based feature selection method for SVMs. However, with the ordinary margin (which we call the Normal Margin), it became clear that the size of the margin does not always correspond appropriately to the quality of the discriminant function obtained by SVM training. That is, a feature set obtained by feature selection using the Normal Margin as the evaluation criterion does not necessarily yield the best discriminant function. To solve this problem, we introduce a new evaluation criterion called the Confident Margin (CM), and a feature selection algorithm, SBS-CM, that uses it ...</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;Confident Margin を用いた SVM のための特徴選択手法&quot;,&quot;attachmentId&quot;:106060919,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/107378786/Confident_Margin_%E3%82%92%E7%94%A8%E3%81%84%E3%81%9F_SVM_%E3%81%AE%E3%81%9F%E3%82%81%E3%81%AE%E7%89%B9%E5%BE%B4%E9%81%B8%E6%8A%9E%E6%89%8B%E6%B3%95&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-wsj-grid-card-view-pdf" href="https://www.academia.edu/107378786/Confident_Margin_%E3%82%92%E7%94%A8%E3%81%84%E3%81%9F_SVM_%E3%81%AE%E3%81%9F%E3%82%81%E3%81%AE%E7%89%B9%E5%BE%B4%E9%81%B8%E6%8A%9E%E6%89%8B%E6%B3%95"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-wsj-grid-card" data-collection-position="2" data-entity-id="125189532" data-sort-order="default"><a class="ds-related-work--title js-wsj-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/125189532/Stability_Analysis_Based_on_Continuous_discrete_Time_System_of_Dynamic_Object_Manipulation_through_A_Soft_Interface">Stability Analysis Based on Continuous-discrete Time System of Dynamic Object Manipulation through A Soft Interface</a><div class="ds-related-work--metadata"><a class="js-wsj-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="277396598" href="https://independent.academia.edu/HiraiShinichi">Shinichi Hirai</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Journal of the Robotics Society of 
Japan, 2006</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Stability Analysis Based on Continuous-discrete Time System of Dynamic Object Manipulation through A Soft Interface","attachmentId":119276754,"attachmentType":"pdf","work_url":"https://www.academia.edu/125189532/Stability_Analysis_Based_on_Continuous_discrete_Time_System_of_Dynamic_Object_Manipulation_through_A_Soft_Interface","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-wsj-grid-card-view-pdf" href="https://www.academia.edu/125189532/Stability_Analysis_Based_on_Continuous_discrete_Time_System_of_Dynamic_Object_Manipulation_through_A_Soft_Interface"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-wsj-grid-card" data-collection-position="3" data-entity-id="45265554" data-sort-order="default"><a class="ds-related-work--title js-wsj-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/45265554/Motion_Parameter_Estimation_by_Using_Spatially_Localized_Reticles_and_Its_Application_to_Real_Time_Object_Tracking">Motion Parameter Estimation by Using Spatially Localized Reticles and Its Application to Real-Time Object Tracking</a><div class="ds-related-work--metadata"><a class="js-wsj-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="97302135" href="https://independent.academia.edu/WMitsuhashi">Wataru Mitsuhashi</a></div><p class="ds-related-work--metadata ds2-5-body-xs">IEEJ Transactions on Electronics Information and Systems</p><div class="ds-related-work--ctas"><button 
class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Motion Parameter Estimation by Using Spatially Localized Reticles and Its Application to Real-Time Object Tracking","attachmentId":65828105,"attachmentType":"pdf","work_url":"https://www.academia.edu/45265554/Motion_Parameter_Estimation_by_Using_Spatially_Localized_Reticles_and_Its_Application_to_Real_Time_Object_Tracking","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-wsj-grid-card-view-pdf" href="https://www.academia.edu/45265554/Motion_Parameter_Estimation_by_Using_Spatially_Localized_Reticles_and_Its_Application_to_Real_Time_Object_Tracking"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-wsj-grid-card" data-collection-position="4" data-entity-id="78747217" data-sort-order="default"><a class="ds-related-work--title js-wsj-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/78747217/High_Speed_Binocular_Active_Camera_System_for_Capturing_Good_Image_of_a_Moving_Object">High-Speed Binocular Active Camera System for Capturing Good Image of a Moving Object</a><div class="ds-related-work--metadata"><a class="js-wsj-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="11974847" href="https://independent.academia.edu/Haiyuanwu">Haiyuan wu</a></div><p class="ds-related-work--metadata ds2-5-body-xs">2008</p><p class="ds-related-work--abstract ds2-5-body-sm">Abstract: This paper proposes a high-speed tracking binocular active camera system that uses two independently operating active cameras to track an object moving at high speed (about 150 degrees/second) and capture clear images of the target. We define a clear image as an image of the target captured at an appropriate size, without blur or defocus. FV-PTZ (Fixed
Viewpoint Pan-Tilt-Zoom) cameras are used as the active cameras, and the K-means tracker algorithm is used to track the target within the image. In the proposed system, the two active cameras are coordinated like a pair of human eyes so that they fixate on the tracked target, with their lines of sight controlled to intersect at a single point on the target. To realize this, we introduce a notion of reliability into the K-means tracker and propose a control method for the two active cameras based on their relative reliability. By using this reliability together with the epipolar constraint, the system performs consistent pan-tilt control while preserving the tracking performance of each active camera, and by simultaneously controlling zoom and focus it realizes an active tracking system that keeps capturing clearer images of the target. Keywords: active tracking, binocular active camera, K-means tracker, clear image, zoom/focus control</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"High-Speed Binocular Active Camera System for Capturing Good Image of a Moving Object","attachmentId":85684720,"attachmentType":"pdf","work_url":"https://www.academia.edu/78747217/High_Speed_Binocular_Active_Camera_System_for_Capturing_Good_Image_of_a_Moving_Object","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-wsj-grid-card-view-pdf" href="https://www.academia.edu/78747217/High_Speed_Binocular_Active_Camera_System_for_Capturing_Good_Image_of_a_Moving_Object"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-wsj-grid-card" data-collection-position="5" data-entity-id="117779026" data-sort-order="default"><a class="ds-related-work--title js-wsj-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/117779026/Real_Time_Free_Viewpoint_Image_Synthesis_Using_Multi_View_Images_and_on_the_Fly_Estimation_of_View_Dependent_Depth_Map">Real-Time Free-Viewpoint Image Synthesis Using Multi-View Images and on-the-Fly Estimation of View-Dependent Depth Map</a><div class="ds-related-work--metadata"><a
class="js-wsj-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="32442920" href="https://u-tokyo.academia.edu/TakeshiNaemura">Takeshi Naemura</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Eizō Jōhō Media Gakkaishi, 2006</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Real-Time Free-Viewpoint Image Synthesis Using Multi-View Images and on-the-Fly Estimation of View-Dependent Depth Map","attachmentId":113552981,"attachmentType":"pdf","work_url":"https://www.academia.edu/117779026/Real_Time_Free_Viewpoint_Image_Synthesis_Using_Multi_View_Images_and_on_the_Fly_Estimation_of_View_Dependent_Depth_Map","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-wsj-grid-card-view-pdf" href="https://www.academia.edu/117779026/Real_Time_Free_Viewpoint_Image_Synthesis_Using_Multi_View_Images_and_on_the_Fly_Estimation_of_View_Dependent_Depth_Map"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-wsj-grid-card" data-collection-position="6" data-entity-id="106273031" data-sort-order="default"><a class="ds-related-work--title js-wsj-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/106273031/Mining_Quantitative_Frequent_Itemsets_Using_Adaptive_Density_Based_Subspace_Clustering">Mining Quantitative Frequent Itemsets Using Adaptive Density-Based Subspace Clustering</a><div class="ds-related-work--metadata"><a class="js-wsj-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="237870011" 
href="https://independent.academia.edu/HiroshiMotoda">Hiroshi Motoda</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Fifth IEEE International Conference on Data Mining (ICDM'05)</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Mining Quantitative Frequent Itemsets Using Adaptive Density-Based Subspace Clustering","attachmentId":105513907,"attachmentType":"pdf","work_url":"https://www.academia.edu/106273031/Mining_Quantitative_Frequent_Itemsets_Using_Adaptive_Density_Based_Subspace_Clustering","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-wsj-grid-card-view-pdf" href="https://www.academia.edu/106273031/Mining_Quantitative_Frequent_Itemsets_Using_Adaptive_Density_Based_Subspace_Clustering"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-wsj-grid-card" data-collection-position="7" data-entity-id="52228567" data-sort-order="default"><a class="ds-related-work--title js-wsj-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/52228567/Studies_on_Range_Image_Segmentation_using_Curvature_Signs">Studies on Range Image Segmentation using Curvature Signs</a><div class="ds-related-work--metadata"><a class="js-wsj-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="126789624" href="https://independent.academia.edu/HitoshiWakizako">Hitoshi Wakizako</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Journal of the Robotics Society of Japan</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link 
ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Studies on Range Image Segmentation using Curvature Signs","attachmentId":69590922,"attachmentType":"pdf","work_url":"https://www.academia.edu/52228567/Studies_on_Range_Image_Segmentation_using_Curvature_Signs","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-wsj-grid-card-view-pdf" href="https://www.academia.edu/52228567/Studies_on_Range_Image_Segmentation_using_Curvature_Signs"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-wsj-grid-card" data-collection-position="8" data-entity-id="93678808" data-sort-order="default"><a class="ds-related-work--title js-wsj-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/93678808/An_Online_Spatial_Aggregation_Method_for_Interactive_Sensor_Data_Browsing">An Online Spatial Aggregation Method for Interactive Sensor Data Browsing</a><div class="ds-related-work--metadata"><a class="js-wsj-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="59519246" href="https://independent.academia.edu/YuichiroAnzai">Yuichiro Anzai</a></div><p class="ds-related-work--metadata ds2-5-body-xs">2003</p><p class="ds-related-work--abstract ds2-5-body-sm">Global environmental information provided by distributed sensor networks is useful for various kinds of location-oriented applications. Spatial aggregation methods such as spatial integration with map data and spatial interpolation between sensor data are effective in interactive and flexible browsing of widely distributed sensor data.
We thus propose an online spatial aggregation method for interactive sensor data browsing. Our method incrementally aggregates sensor data provided by sensor data servers based on location information and shows the aggregated results for each decomposed region.</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"An Online Spatial Aggregation Method for Interactive Sensor Data Browsing","attachmentId":96349061,"attachmentType":"pdf","work_url":"https://www.academia.edu/93678808/An_Online_Spatial_Aggregation_Method_for_Interactive_Sensor_Data_Browsing","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-wsj-grid-card-view-pdf" href="https://www.academia.edu/93678808/An_Online_Spatial_Aggregation_Method_for_Interactive_Sensor_Data_Browsing"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-wsj-grid-card" data-collection-position="9" data-entity-id="83463949" data-sort-order="default"><a class="ds-related-work--title js-wsj-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/83463949/Speedup_of_OWCTY_Model_Checking_Algorithm_Using_Strongly_Connected_Components">Speedup of OWCTY Model Checking Algorithm Using Strongly Connected Components</a><div class="ds-related-work--metadata"><a class="js-wsj-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="41525999" href="https://independent.academia.edu/UedaKazunori">Kazunori Ueda</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Transactions of the Japanese Society for Artificial 
Intelligence, 2011</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Speedup of OWCTY Model Checking Algorithm Using Strongly Connected Components","attachmentId":88797124,"attachmentType":"pdf","work_url":"https://www.academia.edu/83463949/Speedup_of_OWCTY_Model_Checking_Algorithm_Using_Strongly_Connected_Components","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-wsj-grid-card-view-pdf" href="https://www.academia.edu/83463949/Speedup_of_OWCTY_Model_Checking_Algorithm_Using_Strongly_Connected_Components"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div></div></div><div class="ds-sticky-ctas--wrapper js-loswp-sticky-ctas hidden"><div class="ds-sticky-ctas--grid-container"><div class="ds-sticky-ctas--container"><button class="ds2-5-button js-swp-download-button" data-signup-modal="{"location":"continue-reading-button--sticky-ctas","attachmentId":113265852,"attachmentType":"pdf","workUrl":null}">See full PDF</button><button class="ds2-5-button ds2-5-button--secondary js-swp-download-button" data-signup-modal="{"location":"download-pdf-button--sticky-ctas","attachmentId":113265852,"attachmentType":"pdf","workUrl":null}"><span class="material-symbols-outlined" style="font-size: 20px" translate="no">download</span>Download PDF</button></div></div></div><div class="ds-below-fold--grid-container"><div class="ds-work--container js-loswp-embedded-document"><div class="attachment_preview" data-attachment="Attachment_113265852" style="display: none"><div class="js-scribd-document-container"><div 
class="scribd--document-loading js-scribd-document-loader" style="display: block;"><img alt="Loading..." src="//a.academia-assets.com/images/loaders/paper-load.gif" /><p>Loading Preview</p></div></div><div style="text-align: center;"><div class="scribd--no-preview-alert js-preview-unavailable"><p>Sorry, preview is currently unavailable. You can download the paper by clicking the button above.</p></div></div></div></div><div class="ds-sidebar--container js-work-sidebar"><div class="ds-related-content--container"><h2 class="ds-related-content--heading">Related papers</h2><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="0" data-entity-id="108133180" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/108133180/Robust_Picture_Matching_Using_Optimum_Selection_of_Partial_Templates">Robust Picture Matching Using Optimum Selection of Partial Templates</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="59598963" href="https://independent.academia.edu/HaruhisaOkuda">Haruhisa Okuda</a></div><p class="ds-related-work--metadata ds2-5-body-xs">IEEJ Transactions on Electronics, Information and Systems, 2004</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Robust Picture Matching Using Optimum Selection of Partial Templates","attachmentId":106598098,"attachmentType":"pdf","work_url":"https://www.academia.edu/108133180/Robust_Picture_Matching_Using_Optimum_Selection_of_Partial_Templates","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link 
ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/108133180/Robust_Picture_Matching_Using_Optimum_Selection_of_Partial_Templates"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="1" data-entity-id="117798864" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/117798864/Robust_sensing_against_bubble_noises_in_aquatic_environments_with_a_stereo_vision_system">Robust sensing against bubble noises in aquatic environments with a stereo vision system</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="43644598" href="https://independent.academia.edu/AtsushiYamashita1">Atsushi Yamashita</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Proceedings 2006 IEEE International Conference on Robotics and Automation, 2006. 
ICRA 2006.</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Robust sensing against bubble noises in aquatic environments with a stereo vision system","attachmentId":113568262,"attachmentType":"pdf","work_url":"https://www.academia.edu/117798864/Robust_sensing_against_bubble_noises_in_aquatic_environments_with_a_stereo_vision_system","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/117798864/Robust_sensing_against_bubble_noises_in_aquatic_environments_with_a_stereo_vision_system"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="2" data-entity-id="118724393" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/118724393/Simultaneous_Color_Image_and_Depth_Map_Acquisition_with_a_Single_Camera_using_a_Color_Filtered_Aperture">Simultaneous Color Image and Depth Map Acquisition with a Single Camera using a Color-Filtered Aperture</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="148846853" href="https://independent.academia.edu/yusukemoriuchi">yusuke moriuchi</a></div><p class="ds-related-work--metadata ds2-5-body-xs">The Journal of The Institute of Image Information and Television Engineers, 2017</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link 
ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Simultaneous Color Image and Depth Map Acquisition with a Single Camera using a Color-Filtered Aperture","attachmentId":114284028,"attachmentType":"pdf","work_url":"https://www.academia.edu/118724393/Simultaneous_Color_Image_and_Depth_Map_Acquisition_with_a_Single_Camera_using_a_Color_Filtered_Aperture","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/118724393/Simultaneous_Color_Image_and_Depth_Map_Acquisition_with_a_Single_Camera_using_a_Color_Filtered_Aperture"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="3" data-entity-id="117778927" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/117778927/Spatial_Domain_Definition_of_Focus_Measurement_Method_for_Light_Field_Rendering_and_Its_Application_for_Images_Captured_with_Unstructured_Array_of_Cameras">Spatial Domain Definition of Focus Measurement Method for Light Field Rendering and Its Application for Images Captured with Unstructured Array of Cameras</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="32442920" href="https://u-tokyo.academia.edu/TakeshiNaemura">Takeshi Naemura</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Eizō Jōhō Media Gakkaishi, 2005</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link 
ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Spatial Domain Definition of Focus Measurement Method for Light Field Rendering and Its Application for Images Captured with Unstructured Array of Cameras","attachmentId":113552847,"attachmentType":"pdf","work_url":"https://www.academia.edu/117778927/Spatial_Domain_Definition_of_Focus_Measurement_Method_for_Light_Field_Rendering_and_Its_Application_for_Images_Captured_with_Unstructured_Array_of_Cameras","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/117778927/Spatial_Domain_Definition_of_Focus_Measurement_Method_for_Light_Field_Rendering_and_Its_Application_for_Images_Captured_with_Unstructured_Array_of_Cameras"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="4" data-entity-id="70191628" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/70191628/Silhouette_Refining_for_the_Volume_Intersection_Method_with_Random_Pattern_Backgrounds">Silhouette Refining for the Volume Intersection Method with Random Pattern Backgrounds</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="46641092" href="https://independent.academia.edu/MasaakiIiyama">Masaaki Iiyama</a></div><p class="ds-related-work--metadata ds2-5-body-xs">2006</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link 
ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Silhouette Refining for the Volume Intersection Method with Random Pattern Backgrounds","attachmentId":80036799,"attachmentType":"pdf","work_url":"https://www.academia.edu/70191628/Silhouette_Refining_for_the_Volume_Intersection_Method_with_Random_Pattern_Backgrounds","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/70191628/Silhouette_Refining_for_the_Volume_Intersection_Method_with_Random_Pattern_Backgrounds"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="5" data-entity-id="87759998" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/87759998/Deep_Learning_and_Random_Forest_Based_Crack_Detection_from_an_Image_of_Concrete_Surface">Deep Learning and Random Forest Based Crack Detection from an Image of Concrete Surface</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="25157709" href="https://independent.academia.edu/KazuakiOKUBO">Kazuaki OKUBO</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Journal of Japan Society of Civil Engineers, Ser. 
F3 (Civil Engineering Informatics), 2017</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Deep Learning and Random Forest Based Crack Detection from an Image of Concrete Surface","attachmentId":91884903,"attachmentType":"pdf","work_url":"https://www.academia.edu/87759998/Deep_Learning_and_Random_Forest_Based_Crack_Detection_from_an_Image_of_Concrete_Surface","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/87759998/Deep_Learning_and_Random_Forest_Based_Crack_Detection_from_an_Image_of_Concrete_Surface"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="6" data-entity-id="79973526" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/79973526/Analysis_and_implementation_of_non_linear_spatial_filtering_for_image_processing">Analysis and implementation of non linear spatial filtering for image processing</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="207154827" href="https://independent.academia.edu/VociF">Francesco Voci</a></div><p class="ds-related-work--metadata ds2-5-body-xs">2004</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" 
data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Analysis and implementation of non linear spatial filtering for image processing","attachmentId":86509510,"attachmentType":"pdf","work_url":"https://www.academia.edu/79973526/Analysis_and_implementation_of_non_linear_spatial_filtering_for_image_processing","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/79973526/Analysis_and_implementation_of_non_linear_spatial_filtering_for_image_processing"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="7" data-entity-id="117779061" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/117779061/Use_of_View_Parameters_in_Free_Viewpoint_Image_Synthesis_to_Configure_a_Multi_View_Acquisition_System_with_Lens_Array">Use of View-Parameters in Free-Viewpoint Image Synthesis to Configure a Multi-View Acquisition System with Lens Array</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="32442920" href="https://u-tokyo.academia.edu/TakeshiNaemura">Takeshi Naemura</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Eizō Jōhō Media Gakkaishi, 2006</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Use of View-Parameters in Free-Viewpoint Image Synthesis to 
Configure a Multi-View Acquisition System with Lens Array","attachmentId":113552987,"attachmentType":"pdf","work_url":"https://www.academia.edu/117779061/Use_of_View_Parameters_in_Free_Viewpoint_Image_Synthesis_to_Configure_a_Multi_View_Acquisition_System_with_Lens_Array","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/117779061/Use_of_View_Parameters_in_Free_Viewpoint_Image_Synthesis_to_Configure_a_Multi_View_Acquisition_System_with_Lens_Array"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="8" data-entity-id="117778933" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/117778933/Special_Issue_Image_Technology_of_Next_Generation_Self_Similarity_Modeling_for_Interpolation_and_Data_Compression_of_a_Multi_View_3_D_Image">Special Issue Image Technology of Next Generation. 
Self-Similarity Modeling for Interpolation and Data Compression of a Multi-View 3-D Image</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="32442920" href="https://u-tokyo.academia.edu/TakeshiNaemura">Takeshi Naemura</a></div><p class="ds-related-work--metadata ds2-5-body-xs">The Journal of the Institute of Television Engineers of Japan, 1994</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Special Issue Image Technology of Next Generation. Self-Similarity Modeling for Interpolation and Data Compression of a Multi-View 3-D Image","attachmentId":113552918,"attachmentType":"pdf","work_url":"https://www.academia.edu/117778933/Special_Issue_Image_Technology_of_Next_Generation_Self_Similarity_Modeling_for_Interpolation_and_Data_Compression_of_a_Multi_View_3_D_Image","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/117778933/Special_Issue_Image_Technology_of_Next_Generation_Self_Similarity_Modeling_for_Interpolation_and_Data_Compression_of_a_Multi_View_3_D_Image"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="9" data-entity-id="117779073" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" 
href="https://www.academia.edu/117779073/Interactive_Multi_view_Video_Segmentation_System_using_Spatio_temporal_Information_Propagation">Interactive Multi-view Video Segmentation System using Spatio-temporal Information Propagation</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="32442920" href="https://u-tokyo.academia.edu/TakeshiNaemura">Takeshi Naemura</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Eizō Jōhō Media Gakkaishi, 2012</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Interactive Multi-view Video Segmentation System using Spatio-temporal Information Propagation","attachmentId":113552991,"attachmentType":"pdf","work_url":"https://www.academia.edu/117779073/Interactive_Multi_view_Video_Segmentation_System_using_Spatio_temporal_Information_Propagation","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/117779073/Interactive_Multi_view_Video_Segmentation_System_using_Spatio_temporal_Information_Propagation"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="10" data-entity-id="118035487" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" 
href="https://www.academia.edu/118035487/Sensing_of_Feeling_and_Kansei_A_Neural_Network_Approach_for_Classifying_Eye_Shape_by_Features_Obtained_from_the_Subjective_Evaluation_Standard">Sensing of Feeling and "Kansei". A Neural Network Approach for Classifying Eye Shape by Features Obtained from the Subjective Evaluation Standard</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="279199289" href="https://kanazawa-it.academia.edu/SakuichiOhtsuka">Sakuichi Ohtsuka</a></div><p class="ds-related-work--metadata ds2-5-body-xs">The Journal of the Institute of Television Engineers of Japan, 1995</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Sensing of Feeling and \"Kansei\". A Neural Network Approach for Classifying Eye Shape by Features Obtained from the Subjective Evaluation Standard","attachmentId":113754814,"attachmentType":"pdf","work_url":"https://www.academia.edu/118035487/Sensing_of_Feeling_and_Kansei_A_Neural_Network_Approach_for_Classifying_Eye_Shape_by_Features_Obtained_from_the_Subjective_Evaluation_Standard","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/118035487/Sensing_of_Feeling_and_Kansei_A_Neural_Network_Approach_for_Classifying_Eye_Shape_by_Features_Obtained_from_the_Subjective_Evaluation_Standard"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="11" 
data-entity-id="76765574" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/76765574/Studies_on_speech_recognition_based_on_discriminative_statistical_models_and_heuristic_search_strategies">Studies on speech recognition based on discriminative statistical models and heuristic search strategies</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="52852749" href="https://kyoto-u.academia.edu/TatsuyaKawahara">Tatsuya Kawahara</a></div><p class="ds-related-work--metadata ds2-5-body-xs">1995</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Studies on speech recognition based on discriminative statistical models and heuristic search strategies","attachmentId":84359954,"attachmentType":"pdf","work_url":"https://www.academia.edu/76765574/Studies_on_speech_recognition_based_on_discriminative_statistical_models_and_heuristic_search_strategies","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/76765574/Studies_on_speech_recognition_based_on_discriminative_statistical_models_and_heuristic_search_strategies"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="12" data-entity-id="87263612" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title 
ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/87263612/A_Bayesian_Network_based_Method_for_Reducing_Noise_data_for_Anomaly_Detection_in_Flexible_Wireless_Sensor_Networks">A Bayesian Network based Method for Reducing Noise-data for Anomaly Detection in Flexible Wireless Sensor Networks</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="54074974" href="https://nitech.academia.edu/TakayukiIto">Takayuki Ito</a></div><p class="ds-related-work--metadata ds2-5-body-xs">2013</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"A Bayesian Network based Method for Reducing Noise-data for Anomaly Detection in Flexible Wireless Sensor Networks","attachmentId":91522239,"attachmentType":"pdf","work_url":"https://www.academia.edu/87263612/A_Bayesian_Network_based_Method_for_Reducing_Noise_data_for_Anomaly_Detection_in_Flexible_Wireless_Sensor_Networks","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/87263612/A_Bayesian_Network_based_Method_for_Reducing_Noise_data_for_Anomaly_Detection_in_Flexible_Wireless_Sensor_Networks"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="13" data-entity-id="56505201" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" 
href="https://www.academia.edu/56505201/%E7%A7%BB%E5%8B%95%E3%83%AD%E3%83%9C%E3%83%83%E3%83%88%E3%81%AB%E3%82%88%E3%82%8BArtificial_Subtle_Expressions%E3%82%92%E7%94%A8%E3%81%84%E3%81%9F%E7%A2%BA%E4%BF%A1%E5%BA%A6%E8%A1%A8%E5%87%BA">移動ロボットによるArtificial Subtle Expressionsを用いた確信度表出</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="43754967" href="https://shinshu-u.academia.edu/kazukik">Kazuki Kobayashi</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Transactions of the Japanese Society for Artificial Intelligence, 2013</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"移動ロボットによるArtificial Subtle Expressionsを用いた確信度表出","attachmentId":71859059,"attachmentType":"pdf","work_url":"https://www.academia.edu/56505201/%E7%A7%BB%E5%8B%95%E3%83%AD%E3%83%9C%E3%83%83%E3%83%88%E3%81%AB%E3%82%88%E3%82%8BArtificial_Subtle_Expressions%E3%82%92%E7%94%A8%E3%81%84%E3%81%9F%E7%A2%BA%E4%BF%A1%E5%BA%A6%E8%A1%A8%E5%87%BA","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/56505201/%E7%A7%BB%E5%8B%95%E3%83%AD%E3%83%9C%E3%83%83%E3%83%88%E3%81%AB%E3%82%88%E3%82%8BArtificial_Subtle_Expressions%E3%82%92%E7%94%A8%E3%81%84%E3%81%9F%E7%A2%BA%E4%BF%A1%E5%BA%A6%E8%A1%A8%E5%87%BA"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="14" data-entity-id="65987456" data-sort-order="default"><a 
class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/65987456/Improvement_of_Matching_Functions_for_Retrieving_Unified_Presentation_Contents">Improvement of Matching Functions for Retrieving Unified Presentation Contents</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="37267579" href="https://independent.academia.edu/HaruoYokota">Haruo Yokota</a></div><p class="ds-related-work--metadata ds2-5-body-xs">2004</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Improvement of Matching Functions for Retrieving Unified Presentation Contents","attachmentId":77352274,"attachmentType":"pdf","work_url":"https://www.academia.edu/65987456/Improvement_of_Matching_Functions_for_Retrieving_Unified_Presentation_Contents","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/65987456/Improvement_of_Matching_Functions_for_Retrieving_Unified_Presentation_Contents"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="15" data-entity-id="87088673" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" 
href="https://www.academia.edu/87088673/%E8%A6%96%E8%A6%9A%E7%9A%84%E3%81%AA%E5%86%85%E5%AE%B9%E3%81%AB%E3%82%88%E3%82%8A%E7%94%BB%E5%83%8F%E6%A4%9C%E7%B4%A2%E3%82%92%E8%A1%8C%E3%81%86%E3%81%9F%E3%82%81%E3%81%AE%E5%86%85%E5%AE%B9%E4%BC%9D%E9%81%94%E6%89%8B%E6%B3%95%E3%81%AB%E9%96%A2%E3%81%99%E3%82%8B%E7%A0%94%E7%A9%B6">視覚的な内容により画像検索を行うための内容伝達手法に関する研究</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="34000686" href="https://independent.academia.edu/ShigenoriMaeda">Shigenori Maeda</a></div><p class="ds-related-work--metadata ds2-5-body-xs">1999</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"視覚的な内容により画像検索を行うための内容伝達手法に関する研究","attachmentId":91398365,"attachmentType":"pdf","work_url":"https://www.academia.edu/87088673/%E8%A6%96%E8%A6%9A%E7%9A%84%E3%81%AA%E5%86%85%E5%AE%B9%E3%81%AB%E3%82%88%E3%82%8A%E7%94%BB%E5%83%8F%E6%A4%9C%E7%B4%A2%E3%82%92%E8%A1%8C%E3%81%86%E3%81%9F%E3%82%81%E3%81%AE%E5%86%85%E5%AE%B9%E4%BC%9D%E9%81%94%E6%89%8B%E6%B3%95%E3%81%AB%E9%96%A2%E3%81%99%E3%82%8B%E7%A0%94%E7%A9%B6","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/87088673/%E8%A6%96%E8%A6%9A%E7%9A%84%E3%81%AA%E5%86%85%E5%AE%B9%E3%81%AB%E3%82%88%E3%82%8A%E7%94%BB%E5%83%8F%E6%A4%9C%E7%B4%A2%E3%82%92%E8%A1%8C%E3%81%86%E3%81%9F%E3%82%81%E3%81%AE%E5%86%85%E5%AE%B9%E4%BC%9D%E9%81%94%E6%89%8B%E6%B3%95%E3%81%AB%E9%96%A2%E3%81%99%E3%82%8B%E7%A0%94%E7%A9%B6"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" 
translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="16" data-entity-id="84208092" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/84208092/Recovering_Partial_Shape_from_Perspective_Matrix_Using_Simple_Zoom_Lens_Camera_Model">Recovering Partial Shape from Perspective Matrix Using Simple Zoom-Lens Camera Model</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="219990267" href="https://independent.academia.edu/okadayoshihiro">okada yoshihiro</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Ieej Transactions on Electronics, Information and Systems, 2004</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Recovering Partial Shape from Perspective Matrix Using Simple Zoom-Lens Camera Model","attachmentId":89311509,"attachmentType":"pdf","work_url":"https://www.academia.edu/84208092/Recovering_Partial_Shape_from_Perspective_Matrix_Using_Simple_Zoom_Lens_Camera_Model","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/84208092/Recovering_Partial_Shape_from_Perspective_Matrix_Using_Simple_Zoom_Lens_Camera_Model"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="17" 
data-entity-id="117977776" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/117977776/Comparison_of_Lighting_Condition_with_High_Visibility_in_Depth_Discrimination_between_Age_Groups">Comparison of Lighting Condition with High Visibility in Depth Discrimination between Age Groups</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="54377698" href="https://independent.academia.edu/murataatsuo">atsuo murata</a></div><p class="ds-related-work--metadata ds2-5-body-xs">The Japanese Journal of Ergonomics, 2020</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Comparison of Lighting Condition with High Visibility in Depth Discrimination between Age Groups","attachmentId":113709685,"attachmentType":"pdf","work_url":"https://www.academia.edu/117977776/Comparison_of_Lighting_Condition_with_High_Visibility_in_Depth_Discrimination_between_Age_Groups","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/117977776/Comparison_of_Lighting_Condition_with_High_Visibility_in_Depth_Discrimination_between_Age_Groups"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="18" data-entity-id="117778889" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title 
ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/117778889/Extraction_and_Viewing_Parameter_Control_of_Objects_in_a_3D_TV_System">Extraction and Viewing Parameter Control of Objects in a 3D TV System</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="32442920" href="https://u-tokyo.academia.edu/TakeshiNaemura">Takeshi Naemura</a></div><p class="ds-related-work--metadata ds2-5-body-xs">IEICE Technical Report; IEICE Tech. Rep., 2009</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Extraction and Viewing Parameter Control of Objects in a 3D TV System","attachmentId":113552890,"attachmentType":"pdf","work_url":"https://www.academia.edu/117778889/Extraction_and_Viewing_Parameter_Control_of_Objects_in_a_3D_TV_System","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/117778889/Extraction_and_Viewing_Parameter_Control_of_Objects_in_a_3D_TV_System"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="19" data-entity-id="118798239" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/118798239/Physical_Barrier_Detection_for_Wheelchair_Users_with_Depth_Imaging">Physical Barrier Detection for Wheelchair Users with Depth Imaging</a><div class="ds-related-work--metadata"><a 
class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="4079549" href="https://independent.academia.edu/YoshihiroYasumuro">Yoshihiro Yasumuro</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Doboku gakkai ronbunshu, 2018</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Physical Barrier Detection for Wheelchair Users with Depth Imaging","attachmentId":114339024,"attachmentType":"pdf","work_url":"https://www.academia.edu/118798239/Physical_Barrier_Detection_for_Wheelchair_Users_with_Depth_Imaging","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/118798239/Physical_Barrier_Detection_for_Wheelchair_Users_with_Depth_Imaging"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="20" data-entity-id="100464010" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/100464010/Comparison_of_GP_and_SAP_in_the_image_processing_filter_construction_using_pathology_images">Comparison of GP and SAP in the image-processing filter construction using pathology images</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="34038591" href="https://independent.academia.edu/MFukumoto">Manabu Fukumoto</a></div><p class="ds-related-work--metadata ds2-5-body-xs">2010 
3rd International Congress on Image and Signal Processing, 2010</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Comparison of GP and SAP in the image-processing filter construction using pathology images","attachmentId":101281321,"attachmentType":"pdf","work_url":"https://www.academia.edu/100464010/Comparison_of_GP_and_SAP_in_the_image_processing_filter_construction_using_pathology_images","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/100464010/Comparison_of_GP_and_SAP_in_the_image_processing_filter_construction_using_pathology_images"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="21" data-entity-id="117395859" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/117395859/Effect_of_Singular_Value_Decomposition_and_Weighting_by_Singular_Value_of_Document_Term_Matrix_for_Large_scale_Data_Perspective_and_Targeted_Data_Extraction">Effect of Singular Value Decomposition and Weighting by Singular Value of Document-Term Matrix, for Large-scale Data Perspective and Targeted Data Extraction</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="293889218" href="https://independent.academia.edu/TakeshiKobayakawa">Takeshi Kobayakawa</a></div><p class="ds-related-work--metadata 
ds2-5-body-xs">Journal of Natural Language Processing, 2013</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{"location":"wsj-grid-card-download-pdf-modal","work_title":"Effect of Singular Value Decomposition and Weighting by Singular Value of Document-Term Matrix, for Large-scale Data Perspective and Targeted Data Extraction","attachmentId":113263748,"attachmentType":"pdf","work_url":"https://www.academia.edu/117395859/Effect_of_Singular_Value_Decomposition_and_Weighting_by_Singular_Value_of_Document_Term_Matrix_for_Large_scale_Data_Perspective_and_Targeted_Data_Extraction","alternativeTracking":true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/117395859/Effect_of_Singular_Value_Decomposition_and_Weighting_by_Singular_Value_of_Document_Term_Matrix_for_Large_scale_Data_Perspective_and_Targeted_Data_Extraction"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div></div><div class="ds-related-content--container"><h2 class="ds-related-content--heading">Related topics</h2><div class="ds-research-interests--pills-container"><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="422" href="https://www.academia.edu/Documents/in/Computer_Science">Computer Science</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="465" href="https://www.academia.edu/Documents/in/Artificial_Intelligence">Artificial Intelligence</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="854" href="https://www.academia.edu/Documents/in/Computer_Vision">Computer 
Vision</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="160144" href="https://www.academia.edu/Documents/in/Feature_Extraction">Feature Extraction</a></div></div></div></div></div><div class="footer--content"><ul class="footer--main-links hide-on-mobile"><li><a href="https://www.academia.edu/about">About</a></li><li><a href="https://www.academia.edu/press">Press</a></li><li><a rel="nofollow" href="https://medium.com/academia">Blog</a></li><li><a href="https://www.academia.edu/documents">Papers</a></li><li><a href="https://www.academia.edu/topics">Topics</a></li><li><a href="https://www.academia.edu/hiring"><svg style="width: 13px; height: 13px; position: relative; bottom: -1px;" aria-hidden="true" focusable="false" data-prefix="fas" data-icon="briefcase" class="svg-inline--fa fa-briefcase fa-w-16" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path fill="currentColor" d="M320 336c0 8.84-7.16 16-16 16h-96c-8.84 0-16-7.16-16-16v-48H0v144c0 25.6 22.4 48 48 48h416c25.6 0 48-22.4 48-48V288H320v48zm144-208h-80V80c0-25.6-22.4-48-48-48H176c-25.6 0-48 22.4-48 48v48H48c-25.6 0-48 22.4-48 48v80h512v-80c0-25.6-22.4-48-48-48zm-144 0H192V96h128v32z"></path></svg> <strong>We're Hiring!</strong></a></li><li><a href="https://support.academia.edu/"><svg style="width: 12px; height: 12px; position: relative; bottom: -1px;" aria-hidden="true" focusable="false" data-prefix="fas" data-icon="question-circle" class="svg-inline--fa fa-question-circle fa-w-16" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path fill="currentColor" d="M504 256c0 136.997-111.043 248-248 248S8 392.997 8 256C8 119.083 119.043 8 256 8s248 111.083 248 248zM262.655 90c-54.497 0-89.255 22.957-116.549 63.758-3.536 5.286-2.353 12.415 2.715 16.258l34.699 26.31c5.205 3.947 12.621 3.008 16.665-2.122 17.864-22.658 30.113-35.797 57.303-35.797 20.429 0 45.698 13.148 45.698 32.958 0 14.976-12.363 22.667-32.534 33.976C247.128 238.528 216 254.941 
216 296v4c0 6.627 5.373 12 12 12h56c6.627 0 12-5.373 12-12v-1.333c0-28.462 83.186-29.647 83.186-106.667 0-58.002-60.165-102-116.531-102zM256 338c-25.365 0-46 20.635-46 46 0 25.364 20.635 46 46 46s46-20.636 46-46c0-25.365-20.635-46-46-46z"></path></svg> <strong>Help Center</strong></a></li></ul><ul class="footer--research-interests"><li>Find new research papers in:</li><li><a href="https://www.academia.edu/Documents/in/Physics">Physics</a></li><li><a href="https://www.academia.edu/Documents/in/Chemistry">Chemistry</a></li><li><a href="https://www.academia.edu/Documents/in/Biology">Biology</a></li><li><a href="https://www.academia.edu/Documents/in/Health_Sciences">Health Sciences</a></li><li><a href="https://www.academia.edu/Documents/in/Ecology">Ecology</a></li><li><a href="https://www.academia.edu/Documents/in/Earth_Sciences">Earth Sciences</a></li><li><a href="https://www.academia.edu/Documents/in/Cognitive_Science">Cognitive Science</a></li><li><a href="https://www.academia.edu/Documents/in/Mathematics">Mathematics</a></li><li><a href="https://www.academia.edu/Documents/in/Computer_Science">Computer Science</a></li></ul><ul class="footer--legal-links hide-on-mobile"><li><a href="https://www.academia.edu/terms">Terms</a></li><li><a href="https://www.academia.edu/privacy">Privacy</a></li><li><a href="https://www.academia.edu/copyright">Copyright</a></li><li>Academia ©2024</li></ul></div> </body> </html>