
Search results for: Yi-Chun Kuo

href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="Yi-Chun Kuo"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 4</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: Yi-Chun Kuo</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4</span> Fitness Action Recognition Based on MediaPipe</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Zixuan%20Xu">Zixuan Xu</a>, <a href="https://publications.waset.org/abstracts/search?q=Yichun%20Lou"> Yichun Lou</a>, <a href="https://publications.waset.org/abstracts/search?q=Yang%20Song"> Yang Song</a>, <a href="https://publications.waset.org/abstracts/search?q=Zihuai%20Lin"> Zihuai Lin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> MediaPipe is an open-source machine learning computer vision framework that can be ported into a multi-platform environment, which makes it easier to use it to recognize the human activity. Based on this framework, many human recognition systems have been created, but the fundamental issue is the recognition of human behavior and posture. In this paper, two methods are proposed to recognize human gestures based on MediaPipe, the first one uses the Adaptive Boosting algorithm to recognize a series of fitness gestures, and the second one uses the Fast Dynamic Time Warping algorithm to recognize 413 continuous fitness actions. These two methods are also applicable to any human posture movement recognition. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=computer%20vision" title="computer vision">computer vision</a>, <a href="https://publications.waset.org/abstracts/search?q=MediaPipe" title=" MediaPipe"> MediaPipe</a>, <a href="https://publications.waset.org/abstracts/search?q=adaptive%20boosting" title=" adaptive boosting"> adaptive boosting</a>, <a href="https://publications.waset.org/abstracts/search?q=fast%20dynamic%20time%20warping" title=" fast dynamic time warping"> fast dynamic time warping</a> </p> <a href="https://publications.waset.org/abstracts/160758/fitness-action-recognition-based-on-mediapipe" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/160758.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">118</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3</span> Saliency Detection Using a Background Probability Model</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Junling%20Li">Junling Li</a>, <a href="https://publications.waset.org/abstracts/search?q=Fang%20Meng"> Fang Meng</a>, <a href="https://publications.waset.org/abstracts/search?q=Yichun%20Zhang"> Yichun Zhang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Image saliency detection has been long studied, while several challenging problems are still unsolved, such as detecting saliency inaccurately in complex scenes or suppressing salient objects in the image borders. In this paper, we propose a new saliency detection algorithm in order to solving these problems. We represent the image as a graph with superixels as nodes. By considering appearance similarity between the boundary and the background, the proposed method chooses non-saliency boundary nodes as background priors to construct the background probability model. The probability that each node belongs to the model is computed, which measures its similarity with backgrounds. Thus we can calculate saliency by the transformed probability as a metric. We compare our algorithm with ten-state-of-the-art salient detection methods on the public database. Experimental results show that our simple and effective approach can attack those challenging problems that had been baffling in image saliency detection. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=visual%20saliency" title="visual saliency">visual saliency</a>, <a href="https://publications.waset.org/abstracts/search?q=background%20probability" title=" background probability"> background probability</a>, <a href="https://publications.waset.org/abstracts/search?q=boundary%20knowledge" title=" boundary knowledge"> boundary knowledge</a>, <a href="https://publications.waset.org/abstracts/search?q=background%20priors" title=" background priors"> background priors</a> </p> <a href="https://publications.waset.org/abstracts/13729/saliency-detection-using-a-background-probability-model" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/13729.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">429</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2</span> Explore the New Urbanization Patterns of the Varied Terrain Inland Areas: The Case of Hubei Province</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Zhan%20Chen">Zhan Chen</a>, <a href="https://publications.waset.org/abstracts/search?q=Yaping%20Huang"> Yaping Huang</a>, <a href="https://publications.waset.org/abstracts/search?q=Xiao%20Shen"> Xiao Shen</a>, <a href="https://publications.waset.org/abstracts/search?q=Yichun%20Li"> Yichun Li</a> </p> <p class="card-text"><strong>Abstract:</strong></p> New urbanization is a strategic fulcrum of China's future development, regional urbanization is a hot research field, different from the contiguous urbanization patterns of the eastern coastal plains and the node type urbanization patterns of the southwest mountainous areas, central inland areas has the realistic conditions of complex terrain conditions and kinds of phases, the dominant power of urbanization development, organizational power, coordination of the urbanization development and the natural environment, will be the core issue in the process of urbanization. This article starts from the characteristics of the typical urbanization development in such areas of Hubei Province, analyzing the current outstanding and typical problems in the process of urbanization in Hubei Province, and propose targeted to promote the basic ideas and implementation paths of the development of new urbanization, in order to provide experience and learn from similar cities of the development of urbanization. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=varied%20terrain" title="varied terrain">varied terrain</a>, <a href="https://publications.waset.org/abstracts/search?q=inland%20area" title=" inland area"> inland area</a>, <a href="https://publications.waset.org/abstracts/search?q=path%20explore" title=" path explore"> path explore</a>, <a href="https://publications.waset.org/abstracts/search?q=Hubei%20Province" title=" Hubei Province"> Hubei Province</a> </p> <a href="https://publications.waset.org/abstracts/15857/explore-the-new-urbanization-patterns-of-the-varied-terrain-inland-areas-the-case-of-hubei-province" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/15857.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">356</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1</span> An Educational Electronic Health Record with a Configurable User Interface</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Floriane%20Shala">Floriane Shala</a>, <a href="https://publications.waset.org/abstracts/search?q=Evangeline%20Wagner"> Evangeline Wagner</a>, <a href="https://publications.waset.org/abstracts/search?q=Yichun%20Zhao"> Yichun Zhao</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Background: Proper educational training and support are proven to be major components of EHR (Electronic Health Record) implementation and use. However, the majority of health providers are not sufficiently trained in EHR use, leading to adverse events, errors, and decreased quality of care. In response to this, students studying Health Information Science, Public Health, Nursing, and Medicine should all gain a thorough understanding of EHR use at different levels for different purposes. The design of a usable and safe EHR system that accommodates the needs and workflows of different users, user groups, and disciplines is required for EHR learning to be efficient and effective. Objectives: This project builds several artifacts which seek to address both the educational and usability aspects of an educational EHR. The artifacts proposed are models for and examples of such an EHR with a configurable UI to be learned by students who need a background in EHR use during their degrees. Methods: Review literature and gather professional opinions from domain experts on usability, the use of workflow patterns, UI configurability and design, and the educational aspect of EHR use. Conduct interviews in a semi-casual virtual setting with open discussion in order to gain a deeper understanding of the principal aspects of EHR use in educational settings. Select a specific task and user group to illustrate how the proposed solution will function based on the current research. Develop three artifacts based on the available research, professional opinions, and prior knowledge of the topic. The artifacts capture the user task and user鈥檚 interactions with the EHR for learning. The first generic model provides a general understanding of the EHR system process. The second model is a specific example of performing the task of MRI ordering with a configurable UI. 
The third artifact includes UI mock-ups showcasing the models in a practical and visual way. Significance: Due to the lack of educational EHRs, medical professionals do not receive sufficient EHR training. Implementing an educational EHR with a usable and configurable interface to suit the needs of different user groups and disciplines will help facilitate EHR learning and training and ultimately improve the quality of patient care. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=education" title="education">education</a>, <a href="https://publications.waset.org/abstracts/search?q=EHR" title=" EHR"> EHR</a>, <a href="https://publications.waset.org/abstracts/search?q=usability" title=" usability"> usability</a>, <a href="https://publications.waset.org/abstracts/search?q=configurable" title=" configurable"> configurable</a> </p> <a href="https://publications.waset.org/abstracts/143904/an-educational-electronic-health-record-with-a-configurable-user-interface" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/143904.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">157</span> </span> </div> </div> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> 
</div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>
