
Matus Pleva | Technical University of Kosice - Academia.edu

Matus Pleva
Technical University of Kosice, Department of Electronics and Multimedia Telecommunications, Faculty Member
20 Followers · 25 Following · 13 Co-authors · 153 Research papers
Profile: https://tuke.academia.edu/MatusPleva
id="social-redesign-work-container"><div class="upload-header"><h2 class="ds2-5-heading-sans-serif-xs">Uploads</h2></div><div class="documents-container backbone-social-profile-documents" style="width: 100%;"><div class="u-taCenter"></div><div class="profile--tab_content_container js-tab-pane tab-pane active" id="all"><div class="profile--tab_heading_container js-section-heading" data-section="Papers" id="Papers"><h3 class="profile--tab_heading_container">Papers by Matus Pleva</h3></div><div class="js-work-strip profile--work_container" data-work-id="114726374"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726374/Transition_Relevance_Places_Machine_Learning_Based_Detection_in_Dialogue_Interactions"><img alt="Research paper thumbnail of Transition-Relevance Places Machine Learning-Based Detection in Dialogue Interactions" class="work-thumbnail" src="https://attachments.academia-assets.com/111344869/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726374/Transition_Relevance_Places_Machine_Learning_Based_Detection_in_Dialogue_Interactions">Transition-Relevance Places Machine Learning-Based Detection in Dialogue Interactions</a></div><div class="wp-workCard_item"><span>Elektronika Ir Elektrotechnika</span><span>, Jun 27, 2023</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="ed15e7e1833de279ccdef9a1b8b9afd3" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:111344869,&quot;asset_id&quot;:114726374,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/111344869/download_file?st=MTczMjgyNDUyOSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726374"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726374"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726374; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726374]").text(description); $(".js-view-count[data-work-id=114726374]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726374; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726374']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); 
container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726374, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "ed15e7e1833de279ccdef9a1b8b9afd3" } } $('.js-work-strip[data-work-id=114726374]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726374,"title":"Transition-Relevance Places Machine Learning-Based Detection in Dialogue Interactions","translated_title":"","metadata":{"publisher":"Kaunas University of Technology","grobid_abstract":"A transition-relevance place (TRP) represents a place in a conversation where a change of speaker can occur. The appearance and use of these points in the dialogue ensures a correct and smooth alternation between the speakers. In the presented article, we focused on the study of prosodic speech parameters in the Slovak language, and we tried to experimentally verify the potential of these parameters to detect TRP. To study turn-taking issues in dyadic conversations, the Slovak dialogue corpus was collected and annotated. TRP places were identified by the human annotator in the manual labelling process. The data were then divided into chunks that reflect the length of the interpausal dialogue units and the prosodic features were computed. In the Matlab environment, we compared different types of classifiers based on machine learning in the role of an automatic TRP detector based on pitch and intensity parameters. The achieved results indicate that prosodic parameters can be useful in detecting TRP after splitting the dialogue into interpausal units. 
The designed approach can serve as a tool for automatic conversational analysis or can be used to label large databases for training predictive models, which can help machines to enhance human-machine spoken dialogue applications.","publication_date":{"day":27,"month":6,"year":2023,"errors":{}},"publication_name":"Elektronika Ir Elektrotechnika","grobid_abstract_attachment_id":111344869},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726374/Transition_Relevance_Places_Machine_Learning_Based_Detection_in_Dialogue_Interactions","translated_internal_url":"","created_at":"2024-02-10T08:41:11.394-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":111344869,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/111344869/thumbnails/1.jpg","file_name":"33853-Article_Text-123897-1-10-20230705.pdf","download_url":"https://www.academia.edu/attachments/111344869/download_file?st=MTczMjgyNDUyOSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Transition_Relevance_Places_Machine_Lear.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/111344869/33853-Article_Text-123897-1-10-20230705-libre.pdf?1707584132=\u0026response-content-disposition=attachment%3B+filename%3DTransition_Relevance_Places_Machine_Lear.pdf\u0026Expires=1732828129\u0026Signature=Bff3isvv7LQbHJZSy~nL2RPjMkuxnMgF-H306vFyRHtHtz6LYsY6yzDbsNhg7Ad~TYySV-PDVMi810xcMMGyGuvOzB3wTqADJmtt~bAsiJ32FJowhnR3jDv-TJHDKJx~ySS11LzwgYzSnVHHu8z4MneShJ8Bhd0H45DZHAO00MyFjH4nAMfXdQENfF9JwIV2dJpgqHvomFehRZSrQqRMDVasfstlBhwHnZ1xwOmHHcE0F~suCQPoehr8MYrphazz1gCzBVwWru6brgrDHRR44C3iKmVAzoPOHqcpa6PJSGg9HG8BcRc9IcvTU5Z-LkWY8DPGLLSaYhGfjfCwLX5kcw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Transition_Relevance_Places_Machine_Learning_Based_Detection_in_Dialogue_Interactions","translated_slug":"","page_count":7,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus 
Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[{"id":111344869,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/111344869/thumbnails/1.jpg","file_name":"33853-Article_Text-123897-1-10-20230705.pdf","download_url":"https://www.academia.edu/attachments/111344869/download_file?st=MTczMjgyNDUyOSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Transition_Relevance_Places_Machine_Lear.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/111344869/33853-Article_Text-123897-1-10-20230705-libre.pdf?1707584132=\u0026response-content-disposition=attachment%3B+filename%3DTransition_Relevance_Places_Machine_Lear.pdf\u0026Expires=1732828129\u0026Signature=Bff3isvv7LQbHJZSy~nL2RPjMkuxnMgF-H306vFyRHtHtz6LYsY6yzDbsNhg7Ad~TYySV-PDVMi810xcMMGyGuvOzB3wTqADJmtt~bAsiJ32FJowhnR3jDv-TJHDKJx~ySS11LzwgYzSnVHHu8z4MneShJ8Bhd0H45DZHAO00MyFjH4nAMfXdQENfF9JwIV2dJpgqHvomFehRZSrQqRMDVasfstlBhwHnZ1xwOmHHcE0F~suCQPoehr8MYrphazz1gCzBVwWru6brgrDHRR44C3iKmVAzoPOHqcpa6PJSGg9HG8BcRc9IcvTU5Z-LkWY8DPGLLSaYhGfjfCwLX5kcw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"},{"id":111344868,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/111344868/thumbnails/1.jpg","file_name":"33853-Article_Text-123897-1-10-20230705.pdf","download_url":"https://www.academia.edu/attachments/111344868/download_file","bulk_download_file_name":"Transition_Relevance_Places_Machine_Lear.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/111344868/33853-Article_Text-123897-1-10-20230705-libre.pdf?1707584132=\u0026response-content-disposition=attachment%3B+filename%3DTransition_Relevance_Places_Machine_Lear.pdf\u0026Expires=1732828129\u0026Signature=GnPu~7hVQQm632Q7R8hbn3SU7ClVADIW-G06DwYSdZaRsvSqf3hdKoF0P6gtzN1Xlpb57x9Mp-cRXkt-V4l32UbvlqHvSiwo~TLa1xEowPzK4F0N9PSwpCaBPM~oFzOer7h~W0XHbnBvJwJ1b~5In6VNHoDTIZe8p2RiZOA1TIw2bUh0keOd7KFWzS6y-PEPUanlzEy30WY1gNJVuwMbzbeS4CJdbqsjIOQlQW2WT9iqrgCrEpGR3T8AsLZOnpj-bsI-cq8hLBJvGXJwTbYVc~Upg6q20B-IjWyCYseLsxp8iGOgUFDH2JVutWU4oYlkV8CC393Vhd0LDpyadGoEcw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":465,"name":"Artificial Intelligence","url":"https://www.academia.edu/Documents/in/Artificial_Intelligence"},{"id":1432,"name":"Natural Language Processing","url":"https://www.academia.edu/Documents/in/Natural_Language_Processing"},{"id":6121,"name":"Slovak","url":"https://www.academia.edu/Documents/in/Slovak"},{"id":24342,"name":"Conversation","url":"https://www.academia.edu/Documents/in/Conversation"}],"urls":[{"id":39345804,"url":"https://eejournal.ktu.lt/index.php/elt/article/download/33853/15935"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726373"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726373/A_Preliminary_Study_on_Taiwanese_OCR_for_Assisting_Textual_Database_Construction_from_Historical_Documents"><img alt="Research paper thumbnail of A Preliminary Study on Taiwanese OCR for Assisting Textual Database Construction from Historical Documents" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard 
A Preliminary Study on Taiwanese OCR for Assisting Textual Database Construction from Historical Documents
2022 13th International Symposium on Chinese Spoken Language Processing (ISCSLP), Dec 11, 2022

Research interests: Computer Science, Natural Language Processing, Optical Character Recognition, Writing System
DOI: https://doi.org/10.1109/iscslp57327.2022.10038277
Handwriting Data Analysis from Crayonic KeyVault Smart Security Device
2022 20th International Conference on Emerging eLearning Technologies and Applications (ICETA), Oct 20, 2022

Research interests: Computer Science, Artificial Intelligence, Biometrics, Speech Recognition, Handwriting, Accelerometer, Data Set
DOI: https://doi.org/10.1109/iceta57911.2022.9974843
Children-robot spoken interaction in selected educational scenarios
2022 20th International Conference on Emerging eLearning Technologies and Applications (ICETA), October 20, 2022.
Topics: Computer Science, ROBOT, Wizard of Oz.
DOI: https://doi.org/10.1109/iceta57911.2022.9974859

Multimodal dialogue system with NAO and VoiceXML dialogue manager
September 2017.
Abstract: This paper describes a multimodal interactive system based on the NAO humanoid robot with an external dialogue management module, VoiceON. The system can be controlled by voice, and NAO responds to user input with speech and gestures. The presented system uses the robot's built-in speech recognition and speech synthesis modules adapted to Slovak, a language that was not originally supported, and it accepts VoiceXML dialogue applications for all supported languages. Dialogue interaction is managed by a previously developed VoiceXML dialogue manager, and a new module for generating multimodal output (speech and gestures) enables NAO to produce gestures together with speech in several modes.
Topics: Computer Science.
DOI: https://doi.org/10.1109/coginfocom.2017.8268286
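
The abstract states that dialogue interaction is driven by VoiceXML applications interpreted by an external dialogue manager. As a minimal sketch of what such an application looks like (the form, field names, prompt text, and URL below are hypothetical illustrations, not the authors' actual dialogue), the following Python snippet assembles a one-field VoiceXML form with the standard library:

# Minimal sketch of a VoiceXML form such as a VoiceXML dialogue manager
# (e.g. VoiceON, per the abstract) could interpret. The dialogue content
# (prompt, field name, target URL) is hypothetical, for illustration only.
import xml.etree.ElementTree as ET

vxml = ET.Element("vxml", version="2.1", xmlns="http://www.w3.org/2001/vxml")
form = ET.SubElement(vxml, "form", id="greeting")

field = ET.SubElement(form, "field", name="command")
prompt = ET.SubElement(field, "prompt")
prompt.text = "What should the robot do?"  # spoken to the user via TTS

# Once speech recognition fills the field, hand the value to the application.
filled = ET.SubElement(field, "filled")
ET.SubElement(filled, "submit", next="http://localhost/robot", namelist="command")

print(ET.tostring(vxml, encoding="unicode"))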
class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726369"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726369"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726369; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726369]").text(description); $(".js-view-count[data-work-id=114726369]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726369; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726369']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726369, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726369]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726369,"title":"Identification of Trolling in Memes Using Convolutional Neural Networks","translated_title":"","metadata":{"publication_date":{"day":19,"month":4,"year":2023,"errors":{}}},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726369/Identification_of_Trolling_in_Memes_Using_Convolutional_Neural_Networks","translated_internal_url":"","created_at":"2024-02-10T08:41:10.409-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Identification_of_Trolling_in_Memes_Using_Convolutional_Neural_Networks","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":465,"name":"Artificial 
Intelligence","url":"https://www.academia.edu/Documents/in/Artificial_Intelligence"},{"id":988977,"name":"Lexical Analysis","url":"https://www.academia.edu/Documents/in/Lexical_Analysis"},{"id":1568111,"name":"Convolutional Neural Network","url":"https://www.academia.edu/Documents/in/Convolutional_Neural_Network"}],"urls":[{"id":39345799,"url":"https://doi.org/10.1109/radioelektronika57919.2023.10109044"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726368"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726368/Preliminary_evaluation_of_the_multimodal_interactive_system_for_NAO_robot"><img alt="Research paper thumbnail of Preliminary evaluation of the multimodal interactive system for NAO robot" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726368/Preliminary_evaluation_of_the_multimodal_interactive_system_for_NAO_robot">Preliminary evaluation of the multimodal interactive system for NAO robot</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">The proposed paper brings a description of the pilot version of the multimodal dialogue system fo...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">The proposed paper brings a description of the pilot version of the multimodal dialogue system for NAO humanoid robot. Designed system enables multimodal interaction with the user in such manner that it takes a speech input from the user and it answers by a combination of synthetic speech and gestures. The core of the system is an external dialogue manager, which interprets VoiceXML language. A pilot speech communication application was designed and a preliminary evaluation was performed using subjective methods. Results of the preliminary evaluation highlights importance of involving gestures into communication exchange. 
Moreover the paper brings a discussion of usage scenarios, where designed system can be used for educational purposes.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726368"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726368"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726368; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726368]").text(description); $(".js-view-count[data-work-id=114726368]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726368; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726368']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726368, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726368]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726368,"title":"Preliminary evaluation of the multimodal interactive system for NAO robot","translated_title":"","metadata":{"abstract":"The proposed paper brings a description of the pilot version of the multimodal dialogue system for NAO humanoid robot. Designed system enables multimodal interaction with the user in such manner that it takes a speech input from the user and it answers by a combination of synthetic speech and gestures. The core of the system is an external dialogue manager, which interprets VoiceXML language. A pilot speech communication application was designed and a preliminary evaluation was performed using subjective methods. Results of the preliminary evaluation highlights importance of involving gestures into communication exchange. 
Moreover the paper brings a discussion of usage scenarios, where designed system can be used for educational purposes.","publication_date":{"day":1,"month":10,"year":2017,"errors":{}}},"translated_abstract":"The proposed paper brings a description of the pilot version of the multimodal dialogue system for NAO humanoid robot. Designed system enables multimodal interaction with the user in such manner that it takes a speech input from the user and it answers by a combination of synthetic speech and gestures. The core of the system is an external dialogue manager, which interprets VoiceXML language. A pilot speech communication application was designed and a preliminary evaluation was performed using subjective methods. Results of the preliminary evaluation highlights importance of involving gestures into communication exchange. Moreover the paper brings a discussion of usage scenarios, where designed system can be used for educational purposes.","internal_url":"https://www.academia.edu/114726368/Preliminary_evaluation_of_the_multimodal_interactive_system_for_NAO_robot","translated_internal_url":"","created_at":"2024-02-10T08:41:10.199-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Preliminary_evaluation_of_the_multimodal_interactive_system_for_NAO_robot","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":472,"name":"Human Computer Interaction","url":"https://www.academia.edu/Documents/in/Human_Computer_Interaction"},{"id":3147,"name":"Gesture","url":"https://www.academia.edu/Documents/in/Gesture"},{"id":11081,"name":"Multimodal Interaction","url":"https://www.academia.edu/Documents/in/Multimodal_Interaction"},{"id":428932,"name":"Humanoid robot","url":"https://www.academia.edu/Documents/in/Humanoid_robot"}],"urls":[{"id":39345798,"url":"https://doi.org/10.1109/iceta.2017.8102515"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726367"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726367/Static_Audio_Keystroke_Dynamics"><img alt="Research paper thumbnail of Static Audio Keystroke Dynamics" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726367/Static_Audio_Keystroke_Dynamics">Static Audio Keystroke Dynamics</a></div><div class="wp-workCard_item"><span>Springer eBooks</span><span>, 2015</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">In this paper we investigate the accuracy of an identification scheme based 
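
The abstract mentions a response that combines synthetic speech and gestures in several modes. A minimal sketch of that dispatching idea, assuming a hypothetical Robot interface rather than the real NAOqi SDK:

# Hedged sketch of a multimodal output module that combines synthetic speech
# with gestures in several modes, as the abstract describes. The Robot class
# below is a hypothetical stand-in; a real NAO integration would go through
# the NAOqi SDK instead.
from dataclasses import dataclass
from enum import Enum

class OutputMode(Enum):
    SPEECH_ONLY = 1
    GESTURE_ONLY = 2
    SPEECH_WITH_GESTURE = 3

@dataclass
class Robot:  # stand-in for the robot's TTS and motion APIs
    def say(self, text: str) -> None:
        print(f"[TTS] {text}")

    def play_gesture(self, name: str) -> None:
        print(f"[GESTURE] {name}")

def respond(robot: Robot, text: str, gesture: str, mode: OutputMode) -> None:
    """Dispatch one dialogue response according to the selected output mode."""
    if mode in (OutputMode.SPEECH_ONLY, OutputMode.SPEECH_WITH_GESTURE):
        robot.say(text)
    if mode in (OutputMode.GESTURE_ONLY, OutputMode.SPEECH_WITH_GESTURE):
        robot.play_gesture(gesture)

respond(Robot(), "Hello, I am NAO.", "wave", OutputMode.SPEECH_WITH_GESTURE)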

Static Audio Keystroke Dynamics
Springer eBooks, 2015.
Abstract: In this paper we investigate the accuracy of an identification scheme based on the sound of typing a password. The novelty of the paper lies in comparing the performance of timing-based and audio-based keystroke dynamics data in both an authentication and an identification setting. We collected data from 50 people typing the same given password 100 times, divided into 4 sessions of 25 entries, and tested how well the system could recognize the correct typist. When training on data from 3 sessions and testing on the remaining session, we achieved a maximal accuracy of 97.3% using cross-validation. Repeating this with training on 1 session and testing on the 3 remaining sessions, we still achieved an accuracy of 90.6%. The results show the potential of using audio keystroke dynamics as a way to identify users during log-on.
Topics: Computer Science, Speech Recognition, Novelty, Password, Keystroke logging, Keystroke Dynamics, Springer Ebooks.
DOI: https://doi.org/10.1007/978-3-319-26404-2_13
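
The session-based evaluation protocol in the abstract (4 sessions of 25 entries per typist; train on 3 sessions and test on the held-out one) can be sketched as follows. The random features and the SVM classifier are placeholder assumptions, since the abstract does not name the feature set or classifier:

# Sketch of the session-based evaluation protocol from the abstract:
# 50 typists, 4 sessions of 25 password entries each; train on 3 sessions
# and test on the held-out one, rotating the held-out session. The features
# here are random placeholders for the paper's timing/audio keystroke
# features, and the SVM below is an assumed classifier for illustration.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_users, n_sessions, n_entries, n_feats = 50, 4, 25, 40
X = rng.normal(size=(n_users, n_sessions, n_entries, n_feats))
y = np.repeat(np.arange(n_users), n_entries)  # labels for one session block

accs = []
for held_out in range(n_sessions):
    train_sessions = [s for s in range(n_sessions) if s != held_out]
    X_train = X[:, train_sessions].reshape(-1, n_feats)
    y_train = np.repeat(np.arange(n_users), len(train_sessions) * n_entries)
    X_test = X[:, held_out].reshape(-1, n_feats)

    clf = SVC(kernel="rbf").fit(X_train, y_train)
    accs.append(clf.score(X_test, y))  # identification accuracy per fold

print(f"mean identification accuracy: {np.mean(accs):.3f}")

On real features this loop reproduces the "train on 3, test on 1" figure; swapping the roles of train and held-out sessions gives the "train on 1, test on 3" condition.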
class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726366/What_Behavioral_Signals_could_be_Used_to_Identify_Users">What Behavioral Signals could be Used to Identify Users?</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726366"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726366"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726366; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726366]").text(description); $(".js-view-count[data-work-id=114726366]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726366; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726366']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726366, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726366]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726366,"title":"What Behavioral Signals could be Used to Identify 
Users?","translated_title":"","metadata":{"publication_date":{"day":19,"month":1,"year":2023,"errors":{}}},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726366/What_Behavioral_Signals_could_be_Used_to_Identify_Users","translated_internal_url":"","created_at":"2024-02-10T08:41:09.332-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"What_Behavioral_Signals_could_be_Used_to_Identify_Users","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":472,"name":"Human Computer Interaction","url":"https://www.academia.edu/Documents/in/Human_Computer_Interaction"},{"id":9173,"name":"Biometrics","url":"https://www.academia.edu/Documents/in/Biometrics"},{"id":50642,"name":"Virtual Reality","url":"https://www.academia.edu/Documents/in/Virtual_Reality"},{"id":759573,"name":"Login","url":"https://www.academia.edu/Documents/in/Login"}],"urls":[{"id":39345796,"url":"https://doi.org/10.1109/sami58000.2023.10044502"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726362"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726362/BERT_based_Chinese_Medicine_Named_Entity_Recognition_Model_Applied_to_Medication_Reminder_Dialogue_System"><img alt="Research paper thumbnail of BERT-based Chinese Medicine Named Entity Recognition Model Applied to Medication Reminder Dialogue System" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726362/BERT_based_Chinese_Medicine_Named_Entity_Recognition_Model_Applied_to_Medication_Reminder_Dialogue_System">BERT-based Chinese Medicine Named Entity Recognition Model Applied to Medication Reminder Dialogue System</a></div><div class="wp-workCard_item"><span>2022 13th International Symposium on Chinese Spoken Language Processing (ISCSLP)</span><span>, Dec 11, 2022</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726362"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726362"><i 
class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726362; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726362]").text(description); $(".js-view-count[data-work-id=114726362]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726362; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726362']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726362, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726362]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726362,"title":"BERT-based Chinese Medicine Named Entity Recognition Model Applied to Medication Reminder Dialogue System","translated_title":"","metadata":{"publication_date":{"day":11,"month":12,"year":2022,"errors":{}},"publication_name":"2022 13th International Symposium on Chinese Spoken Language Processing (ISCSLP)"},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726362/BERT_based_Chinese_Medicine_Named_Entity_Recognition_Model_Applied_to_Medication_Reminder_Dialogue_System","translated_internal_url":"","created_at":"2024-02-10T08:41:09.012-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"BERT_based_Chinese_Medicine_Named_Entity_Recognition_Model_Applied_to_Medication_Reminder_Dialogue_System","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":29205,"name":"Named Entity Recognition","url":"https://www.academia.edu/Documents/in/Named_Entity_Recognition"},{"id":60107,"name":"Traditional Chinese Medicine","url":"https://www.academia.edu/Documents/in/Traditional_Chinese_Medicine"}],"urls":[{"id":39345793,"url":"https://doi.org/10.1109/iscslp57327.2022.10037867"}]}, dispatcherData: dispatcherData }); 

EMG Data Collection for Multimodal Keystroke Analysis
2022 12th International Conference on Advanced Computer Information Technologies (ACIT), September 26, 2022.
Topics: Computer Science, Software, Password, Keystroke logging, Keystroke Dynamics.
DOI: https://doi.org/10.1109/acit54803.2022.9913124

Modified Ling Six Sound Test Audiometry Application
2022 12th International Conference on Advanced Computer Information Technologies (ACIT), IEEE.
Topics: Computer Science, Audiology, Speech Recognition, Cochlear Implant, Audiometry.

適合漸凍人使用之語音轉換系統初步研究 (Deep Neural-Network Bandwidth Extension and Denoising Voice Conversion System for ALS Patients)
ROCLING 2019 / International Journal of Computational Linguistics and Chinese Language Processing (IJCLCLP), 2019.
Abstract (translated from Chinese): Amyotrophic lateral sclerosis (ALS) is a neurodegenerative disease for which there is currently no cure; patients gradually lose the ability to speak and ultimately cannot communicate by voice, which erodes their sense of self. ALS patients therefore need suitable voice output communication aids (VOCAs), in particular ones with personalized synthetic speech that reproduces the patient's own voice from before the onset of the disease. However, most late-stage patients who can no longer speak did not preserve good recordings of their voice in advance; at best, roughly 20 minutes of low-quality speech can be found, for example lossy-compressed (MP3), limited to a low bandwidth (8 kHz), or contaminated by strong background noise, which makes it impossible to build a personalized speech synthesis system for them. To address these difficulties, this paper combines a general-purpose speech synthesis system with a voice conversion algorithm, adding speech denoising as a front-end stage and a speech super-resolution module as a back-end stage, so that recordings with background noise can be tolerated and high-frequency content (up to 16 kHz) can be restored to low-bandwidth synthetic speech, reconstructing high-quality synthetic speech that is as close as possible to the ALS patient's original voice. Speech denoising uses WaveNet, and speech super-resolution uses a U-Net architecture. A 20-hour high-quality (studio-recorded) educational radio corpus was first used to simulate paired noisy and clean utterances, as well as paired low- and high-bandwidth speech.
Topics: Computer Science, Speech Recognition, Noise reduction, Artificial Neural Network.
PDF: https://www.aclanthology.org/2019.ijclclp-2.3.pdf
class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="563e5e3ba9c1ebd2c0d0b1e03982d6db" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:111344878,&quot;asset_id&quot;:114726355,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/111344878/download_file?st=MTczMjgyNDUyOSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726355"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726355"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726355; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726355]").text(description); $(".js-view-count[data-work-id=114726355]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726355; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726355']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726355, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "563e5e3ba9c1ebd2c0d0b1e03982d6db" } } $('.js-work-strip[data-work-id=114726355]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726355,"title":"System for Automatic Transcription of Audio Meetings in Slovak","translated_title":"","metadata":{"grobid_abstract":"The main aim of the pilot project is a research and development of the meeting speech recognition system for the Slovak language. 
The system includes:  voice recording of meetings in small conference rooms with a limited number of participants using microphone array;  automatic domain-specific and speaker-dependent speech-to-text transcription in the Slovak language;  management for storing, browsing, searching and synchronization of transcripts with audio recordings. Expected outcome is functional prototype, practically applicable software product for automatic transcription of meetings, educational talks and lectures.","grobid_abstract_attachment_id":111344878},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726355/System_for_Automatic_Transcription_of_Audio_Meetings_in_Slovak","translated_internal_url":"","created_at":"2024-02-10T08:41:07.212-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":111344878,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/111344878/thumbnails/1.jpg","file_name":"3-TUKE_System_na_automaticke_rozpoznavanie_a_prepis_mitingovych_zaznamov.pdf","download_url":"https://www.academia.edu/attachments/111344878/download_file?st=MTczMjgyNDUyOSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"System_for_Automatic_Transcription_of_Au.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/111344878/3-TUKE_System_na_automaticke_rozpoznavanie_a_prepis_mitingovych_zaznamov-libre.pdf?1707584135=\u0026response-content-disposition=attachment%3B+filename%3DSystem_for_Automatic_Transcription_of_Au.pdf\u0026Expires=1732828129\u0026Signature=TiCXNaKxltPpbE8mvxMTmwSQ1vU5yKUYgJcSc5X4yhhjx3eYAu0sB1OF9jCLtyh2h2S4OmNY2FbfPzLYC2Ca5hoGClaQMmD0vStxyFUwO2iLQ1IrOroephCUIoiMYh0~jjogfBJ90mjiNeK0PqU-GN0ZEvIMXQnr1IUgw97U42TtTerpD-pX1qJtjsCQQaPyptKeo2tYwnRMqwzviCOkG5FZjV04agEspY9wUFLNEkAGjXbEhQQhpvM4xQdzlZC-53TfO3Sy2jpA2lqFqDUMa5YS0Zuu8m~mGr40IZiBNYjUKJveWw6dzEbuUg-0Thwamw7AmiPNQOjYvKiVpUMujw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"System_for_Automatic_Transcription_of_Audio_Meetings_in_Slovak","translated_slug":"","page_count":1,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus 
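The third component, managing transcripts and keeping them synchronized with the audio, comes down to storing time-stamped segments that can be searched and mapped back to playback positions. Here is a minimal sketch of such a store; the types, method names, and sample segments are hypothetical, not the project's actual implementation.

from dataclasses import dataclass

@dataclass
class Segment:
    start: float   # seconds from the beginning of the recording
    end: float
    speaker: str
    text: str

class TranscriptStore:
    """Store, search, and time-synchronize transcript segments."""
    def __init__(self, segments):
        self.segments = sorted(segments, key=lambda s: s.start)

    def search(self, query):
        """Full-text search; returns matching segments with timestamps."""
        q = query.lower()
        return [s for s in self.segments if q in s.text.lower()]

    def at_time(self, t):
        """Return the segment being spoken at playback position t."""
        for s in self.segments:
            if s.start <= t < s.end:
                return s
        return None

# Hypothetical usage:
store = TranscriptStore([
    Segment(0.0, 3.2, "spk1", "vitajte na stretnuti"),
    Segment(3.2, 6.0, "spk2", "zacneme prehladom projektu"),
])
hit = store.search("projektu")[0]
print(f"{hit.speaker} at {hit.start:.1f}s: {hit.text}")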
Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[{"id":111344878,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/111344878/thumbnails/1.jpg","file_name":"3-TUKE_System_na_automaticke_rozpoznavanie_a_prepis_mitingovych_zaznamov.pdf","download_url":"https://www.academia.edu/attachments/111344878/download_file?st=MTczMjgyNDUyOSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"System_for_Automatic_Transcription_of_Au.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/111344878/3-TUKE_System_na_automaticke_rozpoznavanie_a_prepis_mitingovych_zaznamov-libre.pdf?1707584135=\u0026response-content-disposition=attachment%3B+filename%3DSystem_for_Automatic_Transcription_of_Au.pdf\u0026Expires=1732828129\u0026Signature=TiCXNaKxltPpbE8mvxMTmwSQ1vU5yKUYgJcSc5X4yhhjx3eYAu0sB1OF9jCLtyh2h2S4OmNY2FbfPzLYC2Ca5hoGClaQMmD0vStxyFUwO2iLQ1IrOroephCUIoiMYh0~jjogfBJ90mjiNeK0PqU-GN0ZEvIMXQnr1IUgw97U42TtTerpD-pX1qJtjsCQQaPyptKeo2tYwnRMqwzviCOkG5FZjV04agEspY9wUFLNEkAGjXbEhQQhpvM4xQdzlZC-53TfO3Sy2jpA2lqFqDUMa5YS0Zuu8m~mGr40IZiBNYjUKJveWw6dzEbuUg-0Thwamw7AmiPNQOjYvKiVpUMujw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":48,"name":"Engineering","url":"https://www.academia.edu/Documents/in/Engineering"},{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":6121,"name":"Slovak","url":"https://www.academia.edu/Documents/in/Slovak"},{"id":11984,"name":"Speech Recognition","url":"https://www.academia.edu/Documents/in/Speech_Recognition"},{"id":53293,"name":"Software","url":"https://www.academia.edu/Documents/in/Software"},{"id":1748596,"name":"Microphone","url":"https://www.academia.edu/Documents/in/Microphone"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726350"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726350/Biometric_User_Identification_by_Forearm_EMG_Analysis"><img alt="Research paper thumbnail of Biometric User Identification by Forearm EMG Analysis" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726350/Biometric_User_Identification_by_Forearm_EMG_Analysis">Biometric User Identification by Forearm EMG Analysis</a></div><div class="wp-workCard_item"><span>2022 IEEE International Conference on Consumer Electronics - Taiwan</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726350"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726350"><i class="fa 
fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726350; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726350]").text(description); $(".js-view-count[data-work-id=114726350]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726350; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726350']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726350, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726350]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726350,"title":"Biometric User Identification by Forearm EMG Analysis","translated_title":"","metadata":{"publisher":"IEEE","publication_name":"2022 IEEE International Conference on Consumer Electronics - Taiwan"},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726350/Biometric_User_Identification_by_Forearm_EMG_Analysis","translated_internal_url":"","created_at":"2024-02-10T08:40:37.946-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Biometric_User_Identification_by_Forearm_EMG_Analysis","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":465,"name":"Artificial Intelligence","url":"https://www.academia.edu/Documents/in/Artificial_Intelligence"},{"id":9173,"name":"Biometrics","url":"https://www.academia.edu/Documents/in/Biometrics"},{"id":50642,"name":"Virtual Reality","url":"https://www.academia.edu/Documents/in/Virtual_Reality"},{"id":170918,"name":"Electromyography","url":"https://www.academia.edu/Documents/in/Electromyography"},{"id":1568111,"name":"Convolutional Neural 
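No abstract is attached to this entry, but the listed topics (electromyography, convolutional neural networks, biometrics) point at the usual setup: classify which user produced a window of multi-channel forearm EMG. A minimal 1-D CNN sketch of that setup follows, in PyTorch; the channel count, window length, and number of users are invented for illustration and are not taken from the paper.

import torch
import torch.nn as nn

# Assumed (illustrative) shapes: 8 EMG channels, 200-sample windows, 10 users.
N_CHANNELS, WINDOW, N_USERS = 8, 200, 10

class EMGNet(nn.Module):
    """1-D CNN mapping an EMG window to a user identity."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),       # -> (batch, 64, 1)
        )
        self.classifier = nn.Linear(64, N_USERS)

    def forward(self, x):                  # x: (batch, channels, samples)
        return self.classifier(self.features(x).squeeze(-1))

# Smoke test on random data standing in for real EMG windows.
model = EMGNet()
logits = model(torch.randn(4, N_CHANNELS, WINDOW))
print(logits.shape)  # torch.Size([4, 10])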
Network","url":"https://www.academia.edu/Documents/in/Convolutional_Neural_Network"}],"urls":[{"id":39345782,"url":"http://xplorestaging.ieee.org/ielx7/9868970/9868972/09869268.pdf?arnumber=9869268"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="104938291"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/104938291/Formosa_Speech_Recognition_Challenge_2018_Data_Plan_and_Baselines"><img alt="Research paper thumbnail of Formosa Speech Recognition Challenge 2018: Data, Plan and Baselines" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/104938291/Formosa_Speech_Recognition_Challenge_2018_Data_Plan_and_Baselines">Formosa Speech Recognition Challenge 2018: Data, Plan and Baselines</a></div><div class="wp-workCard_item"><span>2018 11th International Symposium on Chinese Spoken Language Processing (ISCSLP)</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">This paper introduces the Formosa speech recognition (FSR) challenge 2018, presents the provided ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">This paper introduces the Formosa speech recognition (FSR) challenge 2018, presents the provided data profile, evaluation plan and reports the experimental results of the baseline systems. This challenge focuses on spontaneous Taiwanese Mandarin speech recognition (TMSR) and it is based on a real-life, multigene broadcast radio speech corpus, NER-Trs-Vol1, selected from the Formosa speech in the wild (FSW) project. To assist participants to establish a good starting system, a set of baseline systems were published based on various deep neural network (DNN) models. NER-Trs-Vol1 is free for participants (noncommercial license), and its corresponding Kaldi recipes for the baselines have been published online. 
Experimental results show that the combination of NER-Trs-Vol1 and Kaldi recipes is a good resource pack for spontaneous TMSR research and could be used to initialize an advanced semi-supervised training procedure to further improve the recognition performance.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="104938291"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="104938291"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 104938291; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=104938291]").text(description); $(".js-view-count[data-work-id=104938291]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 104938291; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='104938291']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 104938291, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=104938291]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":104938291,"title":"Formosa Speech Recognition Challenge 2018: Data, Plan and Baselines","translated_title":"","metadata":{"abstract":"This paper introduces the Formosa speech recognition (FSR) challenge 2018, presents the provided data profile, evaluation plan and reports the experimental results of the baseline systems. This challenge focuses on spontaneous Taiwanese Mandarin speech recognition (TMSR) and it is based on a real-life, multigene broadcast radio speech corpus, NER-Trs-Vol1, selected from the Formosa speech in the wild (FSW) project. To assist participants to establish a good starting system, a set of baseline systems were published based on various deep neural network (DNN) models. 
NER-Trs-Vol1 is free for participants (noncommercial license), and its corresponding Kaldi recipes for the baselines have been published online. Experimental results show that the combination of NER-Trs-Vol1 and Kaldi recipes is a good resource pack for spontaneous TMSR research and could be used to initialize an advanced semi-supervised training procedure to further improve the recognition performance.","publisher":"IEEE","publication_name":"2018 11th International Symposium on Chinese Spoken Language Processing (ISCSLP)"},"translated_abstract":"This paper introduces the Formosa speech recognition (FSR) challenge 2018, presents the provided data profile, evaluation plan and reports the experimental results of the baseline systems. This challenge focuses on spontaneous Taiwanese Mandarin speech recognition (TMSR) and it is based on a real-life, multigene broadcast radio speech corpus, NER-Trs-Vol1, selected from the Formosa speech in the wild (FSW) project. To assist participants to establish a good starting system, a set of baseline systems were published based on various deep neural network (DNN) models. NER-Trs-Vol1 is free for participants (noncommercial license), and its corresponding Kaldi recipes for the baselines have been published online. Experimental results show that the combination of NER-Trs-Vol1 and Kaldi recipes is a good resource pack for spontaneous TMSR research and could be used to initialize an advanced semi-supervised training procedure to further improve the recognition performance.","internal_url":"https://www.academia.edu/104938291/Formosa_Speech_Recognition_Challenge_2018_Data_Plan_and_Baselines","translated_internal_url":"","created_at":"2023-07-25T22:20:01.601-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Formosa_Speech_Recognition_Challenge_2018_Data_Plan_and_Baselines","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":11984,"name":"Speech Recognition","url":"https://www.academia.edu/Documents/in/Speech_Recognition"},{"id":339518,"name":"LICENSE","url":"https://www.academia.edu/Documents/in/LICENSE"}],"urls":[{"id":33080472,"url":"http://xplorestaging.ieee.org/ielx7/8701133/8706262/08706700.pdf?arnumber=8706700"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="104938290"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/104938290/Human_Computer_Interaction_for_Intelligent_Systems"><img alt="Research paper thumbnail of Human–Computer Interaction for Intelligent Systems" class="work-thumbnail" src="https://attachments.academia-assets.com/104531848/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a 
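The "recognition performance" that such challenge baselines report is an error rate computed by aligning the hypothesis against the reference with edit distance. As a reference point, here is a minimal sketch of that metric; the example strings are invented, and whether the paper scores words or characters is an assumption (character scoring is common for Mandarin).

def wer(ref, hyp):
    """Error rate: (substitutions + deletions + insertions) / len(ref),
    computed with the standard Levenshtein dynamic program."""
    R, H = len(ref), len(hyp)
    d = [[0] * (H + 1) for _ in range(R + 1)]
    for i in range(R + 1):
        d[i][0] = i                        # delete everything
    for j in range(H + 1):
        d[0][j] = j                        # insert everything
    for i in range(1, R + 1):
        for j in range(1, H + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution / match
    return d[R][H] / R

# For Mandarin the same routine is typically run over characters:
print(wer(list("今天天氣很好"), list("今天天很好")))  # one deletion -> 0.1666...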
class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/104938290/Human_Computer_Interaction_for_Intelligent_Systems">Human–Computer Interaction for Intelligent Systems</a></div><div class="wp-workCard_item"><span>Electronics</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">The further development of human–computer interaction applications is still in great demand as us...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">The further development of human–computer interaction applications is still in great demand as users expect more natural interactions [...]</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="ad0b79d6639a45837d8af05eb5d7f399" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:104531848,&quot;asset_id&quot;:104938290,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/104531848/download_file?st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="104938290"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="104938290"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 104938290; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=104938290]").text(description); $(".js-view-count[data-work-id=104938290]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 104938290; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='104938290']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 104938290, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ 
window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "ad0b79d6639a45837d8af05eb5d7f399" } } $('.js-work-strip[data-work-id=104938290]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":104938290,"title":"Human–Computer Interaction for Intelligent Systems","translated_title":"","metadata":{"abstract":"The further development of human–computer interaction applications is still in great demand as users expect more natural interactions [...]","publisher":"MDPI AG","publication_name":"Electronics"},"translated_abstract":"The further development of human–computer interaction applications is still in great demand as users expect more natural interactions [...]","internal_url":"https://www.academia.edu/104938290/Human_Computer_Interaction_for_Intelligent_Systems","translated_internal_url":"","created_at":"2023-07-25T22:20:01.280-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":104531848,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/104531848/thumbnails/1.jpg","file_name":"pdf.pdf","download_url":"https://www.academia.edu/attachments/104531848/download_file?st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_Computer_Interaction_for_Intellige.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/104531848/pdf-libre.pdf?1690355193=\u0026response-content-disposition=attachment%3B+filename%3DHuman_Computer_Interaction_for_Intellige.pdf\u0026Expires=1732828130\u0026Signature=apCygspxz19ppXCiruuI8bHRuNmX-XQU4q9h1bcf0fcTfqvrexoa2DeH7fFg8xG6WEAfXQdPy9mDQnFAsWIkzACQ0c4YveM9EqiwPg7PIQGCBUm4bLnBzknG-F14-pkDKEkEL5i5XgGQGjaAKGcQbwlPwELftvWuSr7EXrdxCZwB3x6JC0T~83xq5uwLlCCG-BRQr-jmdHk7aVqGoCpHjSkVdEBCuvMVc7aHQZjAY8BoPaxRwlHlLep6EZxMzooUFWRG3FhIHZDTYBlTpeysfkHz3rJj2avctPjjZ3Gn1EczRaOMZWz9CtJQhc4tcl7cSRd3u7neC7-im2Bc4hOjNw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Human_Computer_Interaction_for_Intelligent_Systems","translated_slug":"","page_count":4,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus 
Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[{"id":104531848,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/104531848/thumbnails/1.jpg","file_name":"pdf.pdf","download_url":"https://www.academia.edu/attachments/104531848/download_file?st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_Computer_Interaction_for_Intellige.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/104531848/pdf-libre.pdf?1690355193=\u0026response-content-disposition=attachment%3B+filename%3DHuman_Computer_Interaction_for_Intellige.pdf\u0026Expires=1732828130\u0026Signature=apCygspxz19ppXCiruuI8bHRuNmX-XQU4q9h1bcf0fcTfqvrexoa2DeH7fFg8xG6WEAfXQdPy9mDQnFAsWIkzACQ0c4YveM9EqiwPg7PIQGCBUm4bLnBzknG-F14-pkDKEkEL5i5XgGQGjaAKGcQbwlPwELftvWuSr7EXrdxCZwB3x6JC0T~83xq5uwLlCCG-BRQr-jmdHk7aVqGoCpHjSkVdEBCuvMVc7aHQZjAY8BoPaxRwlHlLep6EZxMzooUFWRG3FhIHZDTYBlTpeysfkHz3rJj2avctPjjZ3Gn1EczRaOMZWz9CtJQhc4tcl7cSRd3u7neC7-im2Bc4hOjNw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":472,"name":"Human Computer Interaction","url":"https://www.academia.edu/Documents/in/Human_Computer_Interaction"},{"id":4758,"name":"Electronics","url":"https://www.academia.edu/Documents/in/Electronics"},{"id":20492,"name":"Spelling","url":"https://www.academia.edu/Documents/in/Spelling"},{"id":60270,"name":"Modalities","url":"https://www.academia.edu/Documents/in/Modalities"},{"id":3418189,"name":"testbed","url":"https://www.academia.edu/Documents/in/testbed"}],"urls":[{"id":33080471,"url":"https://www.mdpi.com/2079-9292/12/1/161/pdf"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="104938289"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/104938289/Comparison_of_Statistical_Algorithms_and_Deep_Learning_for_Slovak_Document_Classification"><img alt="Research paper thumbnail of Comparison of Statistical Algorithms and Deep Learning for Slovak Document Classification" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/104938289/Comparison_of_Statistical_Algorithms_and_Deep_Learning_for_Slovak_Document_Classification">Comparison of Statistical Algorithms and Deep Learning for Slovak Document Classification</a></div><div class="wp-workCard_item"><span>2022 IEEE International Conference on Consumer Electronics - Taiwan</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="104938289"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item 
wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="104938289"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 104938289; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=104938289]").text(description); $(".js-view-count[data-work-id=104938289]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 104938289; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='104938289']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 104938289, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=104938289]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":104938289,"title":"Comparison of Statistical Algorithms and Deep Learning for Slovak Document Classification","translated_title":"","metadata":{"publisher":"IEEE","publication_name":"2022 IEEE International Conference on Consumer Electronics - Taiwan"},"translated_abstract":null,"internal_url":"https://www.academia.edu/104938289/Comparison_of_Statistical_Algorithms_and_Deep_Learning_for_Slovak_Document_Classification","translated_internal_url":"","created_at":"2023-07-25T22:20:00.700-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Comparison_of_Statistical_Algorithms_and_Deep_Learning_for_Slovak_Document_Classification","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":465,"name":"Artificial Intelligence","url":"https://www.academia.edu/Documents/in/Artificial_Intelligence"},{"id":1432,"name":"Natural Language Processing","url":"https://www.academia.edu/Documents/in/Natural_Language_Processing"},{"id":6121,"name":"Slovak","url":"https://www.academia.edu/Documents/in/Slovak"},{"id":3163891,"name":"Statistical 
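Only the title and topic tags survive for this entry, but the comparison it names has a standard statistical side: TF-IDF features with a linear classifier, against which deep models are measured. A minimal scikit-learn sketch of that baseline follows; the toy Slovak snippets and labels are invented and are not the paper's data or exact setup.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented two-class toy data standing in for Slovak documents.
docs = ["vláda schválila nový zákon", "futbalisti vyhrali zápas",
        "parlament rokoval o rozpočte", "hokejisti postúpili do finále"]
labels = ["politika", "šport", "politika", "šport"]

# TF-IDF + logistic regression: the classic statistical baseline.
baseline = make_pipeline(TfidfVectorizer(), LogisticRegression())
baseline.fit(docs, labels)
print(baseline.predict(["tréner chváli zápas"]))  # -> ['šport']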
Classification","url":"https://www.academia.edu/Documents/in/Statistical_Classification"}],"urls":[{"id":33080470,"url":"http://xplorestaging.ieee.org/ielx7/9868970/9868972/09869155.pdf?arnumber=9869155"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="104938288"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/104938288/Slovak_dialogue_corpus_with_backchannel_annotation"><img alt="Research paper thumbnail of Slovak dialogue corpus with backchannel annotation" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/104938288/Slovak_dialogue_corpus_with_backchannel_annotation">Slovak dialogue corpus with backchannel annotation</a></div><div class="wp-workCard_item"><span>2022 32nd International Conference Radioelektronika (RADIOELEKTRONIKA)</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="104938288"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="104938288"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 104938288; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=104938288]").text(description); $(".js-view-count[data-work-id=104938288]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 104938288; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='104938288']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 104938288, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); 
dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=104938288]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":104938288,"title":"Slovak dialogue corpus with backchannel annotation","translated_title":"","metadata":{"publisher":"IEEE","publication_name":"2022 32nd International Conference Radioelektronika (RADIOELEKTRONIKA)"},"translated_abstract":null,"internal_url":"https://www.academia.edu/104938288/Slovak_dialogue_corpus_with_backchannel_annotation","translated_internal_url":"","created_at":"2023-07-25T22:20:00.349-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Slovak_dialogue_corpus_with_backchannel_annotation","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":1432,"name":"Natural Language Processing","url":"https://www.academia.edu/Documents/in/Natural_Language_Processing"},{"id":6121,"name":"Slovak","url":"https://www.academia.edu/Documents/in/Slovak"},{"id":11984,"name":"Speech Recognition","url":"https://www.academia.edu/Documents/in/Speech_Recognition"},{"id":38072,"name":"Annotation","url":"https://www.academia.edu/Documents/in/Annotation"}],"urls":[{"id":33080469,"url":"http://xplorestaging.ieee.org/ielx7/9764897/9764898/09764955.pdf?arnumber=9764955"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="104938287"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/104938287/Personalized_Taiwanese_Speech_Synthesis_using_Cascaded_ASR_and_TTS_Framework"><img alt="Research paper thumbnail of Personalized Taiwanese Speech Synthesis using Cascaded ASR and TTS Framework" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/104938287/Personalized_Taiwanese_Speech_Synthesis_using_Cascaded_ASR_and_TTS_Framework">Personalized Taiwanese Speech Synthesis using Cascaded ASR and TTS Framework</a></div><div class="wp-workCard_item"><span>2022 32nd International Conference Radioelektronika (RADIOELEKTRONIKA)</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="104938287"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa 
fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="104938287"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 104938287; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=104938287]").text(description); $(".js-view-count[data-work-id=104938287]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 104938287; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='104938287']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 104938287, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=104938287]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":104938287,"title":"Personalized Taiwanese Speech Synthesis using Cascaded ASR and TTS Framework","translated_title":"","metadata":{"publisher":"IEEE","publication_name":"2022 32nd International Conference Radioelektronika (RADIOELEKTRONIKA)"},"translated_abstract":null,"internal_url":"https://www.academia.edu/104938287/Personalized_Taiwanese_Speech_Synthesis_using_Cascaded_ASR_and_TTS_Framework","translated_internal_url":"","created_at":"2023-07-25T22:19:59.866-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Personalized_Taiwanese_Speech_Synthesis_using_Cascaded_ASR_and_TTS_Framework","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":1432,"name":"Natural Language Processing","url":"https://www.academia.edu/Documents/in/Natural_Language_Processing"},{"id":2342,"name":"Speech Synthesis","url":"https://www.academia.edu/Documents/in/Speech_Synthesis"},{"id":11984,"name":"Speech 
Recognition","url":"https://www.academia.edu/Documents/in/Speech_Recognition"},{"id":200138,"name":"Mandarin Chinese","url":"https://www.academia.edu/Documents/in/Mandarin_Chinese"},{"id":726220,"name":"Speech corpus","url":"https://www.academia.edu/Documents/in/Speech_corpus"}],"urls":[{"id":33080468,"url":"http://xplorestaging.ieee.org/ielx7/9764897/9764898/09764940.pdf?arnumber=9764940"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> </div><div class="profile--tab_content_container js-tab-pane tab-pane" data-section-id="10445103" id="papers"><div class="js-work-strip profile--work_container" data-work-id="114726374"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726374/Transition_Relevance_Places_Machine_Learning_Based_Detection_in_Dialogue_Interactions"><img alt="Research paper thumbnail of Transition-Relevance Places Machine Learning-Based Detection in Dialogue Interactions" class="work-thumbnail" src="https://attachments.academia-assets.com/111344869/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726374/Transition_Relevance_Places_Machine_Learning_Based_Detection_in_Dialogue_Interactions">Transition-Relevance Places Machine Learning-Based Detection in Dialogue Interactions</a></div><div class="wp-workCard_item"><span>Elektronika Ir Elektrotechnika</span><span>, Jun 27, 2023</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="ed15e7e1833de279ccdef9a1b8b9afd3" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:111344869,&quot;asset_id&quot;:114726374,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/111344869/download_file?st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&st=MTczMjgyNDUyOSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726374"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726374"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726374; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726374]").text(description); $(".js-view-count[data-work-id=114726374]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726374; 
window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726374']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726374, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "ed15e7e1833de279ccdef9a1b8b9afd3" } } $('.js-work-strip[data-work-id=114726374]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726374,"title":"Transition-Relevance Places Machine Learning-Based Detection in Dialogue Interactions","translated_title":"","metadata":{"publisher":"Kaunas University of Technology","grobid_abstract":"A transition-relevance place (TRP) represents a place in a conversation where a change of speaker can occur. The appearance and use of these points in the dialogue ensures a correct and smooth alternation between the speakers. In the presented article, we focused on the study of prosodic speech parameters in the Slovak language, and we tried to experimentally verify the potential of these parameters to detect TRP. To study turn-taking issues in dyadic conversations, the Slovak dialogue corpus was collected and annotated. TRP places were identified by the human annotator in the manual labelling process. The data were then divided into chunks that reflect the length of the interpausal dialogue units and the prosodic features were computed. In the Matlab environment, we compared different types of classifiers based on machine learning in the role of an automatic TRP detector based on pitch and intensity parameters. The achieved results indicate that prosodic parameters can be useful in detecting TRP after splitting the dialogue into interpausal units. 
The designed approach can serve as a tool for automatic conversational analysis or can be used to label large databases for training predictive models, which can help machines to enhance human-machine spoken dialogue applications.","publication_date":{"day":27,"month":6,"year":2023,"errors":{}},"publication_name":"Elektronika Ir Elektrotechnika","grobid_abstract_attachment_id":111344869},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726374/Transition_Relevance_Places_Machine_Learning_Based_Detection_in_Dialogue_Interactions","translated_internal_url":"","created_at":"2024-02-10T08:41:11.394-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":111344869,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/111344869/thumbnails/1.jpg","file_name":"33853-Article_Text-123897-1-10-20230705.pdf","download_url":"https://www.academia.edu/attachments/111344869/download_file?st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&st=MTczMjgyNDUyOSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Transition_Relevance_Places_Machine_Lear.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/111344869/33853-Article_Text-123897-1-10-20230705-libre.pdf?1707584132=\u0026response-content-disposition=attachment%3B+filename%3DTransition_Relevance_Places_Machine_Lear.pdf\u0026Expires=1732828129\u0026Signature=Bff3isvv7LQbHJZSy~nL2RPjMkuxnMgF-H306vFyRHtHtz6LYsY6yzDbsNhg7Ad~TYySV-PDVMi810xcMMGyGuvOzB3wTqADJmtt~bAsiJ32FJowhnR3jDv-TJHDKJx~ySS11LzwgYzSnVHHu8z4MneShJ8Bhd0H45DZHAO00MyFjH4nAMfXdQENfF9JwIV2dJpgqHvomFehRZSrQqRMDVasfstlBhwHnZ1xwOmHHcE0F~suCQPoehr8MYrphazz1gCzBVwWru6brgrDHRR44C3iKmVAzoPOHqcpa6PJSGg9HG8BcRc9IcvTU5Z-LkWY8DPGLLSaYhGfjfCwLX5kcw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Transition_Relevance_Places_Machine_Learning_Based_Detection_in_Dialogue_Interactions","translated_slug":"","page_count":7,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus 
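The pipeline in the abstract is: split the dialogue at pauses into interpausal units, compute pitch and intensity statistics per unit, and feed those to a trained classifier that flags units ending in a TRP. A sketch of that feature-and-classify step follows; the paper worked in Matlab, so scikit-learn stands in here, and the feature set, classifier choice, and all data values are illustrative assumptions.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

def prosodic_features(pitch_hz, intensity_db):
    """Summary statistics of pitch and intensity over one interpausal unit,
    including the unit-final trend that can signal a possible TRP."""
    p, i = np.asarray(pitch_hz), np.asarray(intensity_db)
    tail = max(1, len(p) // 4)             # last quarter of the unit
    return [p.mean(), p.std(), p[-tail:].mean() - p.mean(),
            i.mean(), i.std(), i[-tail:].mean() - i.mean()]

# Invented training data: one feature row per interpausal unit,
# label 1 where a human annotator marked a TRP at the unit's end.
rng = np.random.default_rng(0)
X = [prosodic_features(rng.normal(120, 20, 50), rng.normal(60, 5, 50))
     for _ in range(200)]
y = rng.integers(0, 2, 200)                # stand-in for manual TRP labels

detector = RandomForestClassifier(n_estimators=100).fit(X, y)
print(detector.predict([prosodic_features(rng.normal(110, 15, 50),
                                          rng.normal(58, 4, 50))]))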
Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[{"id":111344869,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/111344869/thumbnails/1.jpg","file_name":"33853-Article_Text-123897-1-10-20230705.pdf","download_url":"https://www.academia.edu/attachments/111344869/download_file?st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&st=MTczMjgyNDUyOSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Transition_Relevance_Places_Machine_Lear.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/111344869/33853-Article_Text-123897-1-10-20230705-libre.pdf?1707584132=\u0026response-content-disposition=attachment%3B+filename%3DTransition_Relevance_Places_Machine_Lear.pdf\u0026Expires=1732828129\u0026Signature=Bff3isvv7LQbHJZSy~nL2RPjMkuxnMgF-H306vFyRHtHtz6LYsY6yzDbsNhg7Ad~TYySV-PDVMi810xcMMGyGuvOzB3wTqADJmtt~bAsiJ32FJowhnR3jDv-TJHDKJx~ySS11LzwgYzSnVHHu8z4MneShJ8Bhd0H45DZHAO00MyFjH4nAMfXdQENfF9JwIV2dJpgqHvomFehRZSrQqRMDVasfstlBhwHnZ1xwOmHHcE0F~suCQPoehr8MYrphazz1gCzBVwWru6brgrDHRR44C3iKmVAzoPOHqcpa6PJSGg9HG8BcRc9IcvTU5Z-LkWY8DPGLLSaYhGfjfCwLX5kcw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"},{"id":111344868,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/111344868/thumbnails/1.jpg","file_name":"33853-Article_Text-123897-1-10-20230705.pdf","download_url":"https://www.academia.edu/attachments/111344868/download_file","bulk_download_file_name":"Transition_Relevance_Places_Machine_Lear.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/111344868/33853-Article_Text-123897-1-10-20230705-libre.pdf?1707584132=\u0026response-content-disposition=attachment%3B+filename%3DTransition_Relevance_Places_Machine_Lear.pdf\u0026Expires=1732828129\u0026Signature=GnPu~7hVQQm632Q7R8hbn3SU7ClVADIW-G06DwYSdZaRsvSqf3hdKoF0P6gtzN1Xlpb57x9Mp-cRXkt-V4l32UbvlqHvSiwo~TLa1xEowPzK4F0N9PSwpCaBPM~oFzOer7h~W0XHbnBvJwJ1b~5In6VNHoDTIZe8p2RiZOA1TIw2bUh0keOd7KFWzS6y-PEPUanlzEy30WY1gNJVuwMbzbeS4CJdbqsjIOQlQW2WT9iqrgCrEpGR3T8AsLZOnpj-bsI-cq8hLBJvGXJwTbYVc~Upg6q20B-IjWyCYseLsxp8iGOgUFDH2JVutWU4oYlkV8CC393Vhd0LDpyadGoEcw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":465,"name":"Artificial Intelligence","url":"https://www.academia.edu/Documents/in/Artificial_Intelligence"},{"id":1432,"name":"Natural Language Processing","url":"https://www.academia.edu/Documents/in/Natural_Language_Processing"},{"id":6121,"name":"Slovak","url":"https://www.academia.edu/Documents/in/Slovak"},{"id":24342,"name":"Conversation","url":"https://www.academia.edu/Documents/in/Conversation"}],"urls":[{"id":39345804,"url":"https://eejournal.ktu.lt/index.php/elt/article/download/33853/15935"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726373"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726373/A_Preliminary_Study_on_Taiwanese_OCR_for_Assisting_Textual_Database_Construction_from_Historical_Documents"><img alt="Research paper thumbnail of A Preliminary Study on Taiwanese OCR for Assisting Textual Database Construction from Historical Documents" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div 
class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726373/A_Preliminary_Study_on_Taiwanese_OCR_for_Assisting_Textual_Database_Construction_from_Historical_Documents">A Preliminary Study on Taiwanese OCR for Assisting Textual Database Construction from Historical Documents</a></div><div class="wp-workCard_item"><span>2022 13th International Symposium on Chinese Spoken Language Processing (ISCSLP)</span><span>, Dec 11, 2022</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726373"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726373"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726373; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726373]").text(description); $(".js-view-count[data-work-id=114726373]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726373; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726373']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726373, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726373]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726373,"title":"A Preliminary Study on Taiwanese OCR for Assisting Textual Database Construction from Historical Documents","translated_title":"","metadata":{"publication_date":{"day":11,"month":12,"year":2022,"errors":{}},"publication_name":"2022 13th International Symposium on Chinese Spoken Language Processing 
(ISCSLP)"},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726373/A_Preliminary_Study_on_Taiwanese_OCR_for_Assisting_Textual_Database_Construction_from_Historical_Documents","translated_internal_url":"","created_at":"2024-02-10T08:41:11.209-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"A_Preliminary_Study_on_Taiwanese_OCR_for_Assisting_Textual_Database_Construction_from_Historical_Documents","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":1432,"name":"Natural Language Processing","url":"https://www.academia.edu/Documents/in/Natural_Language_Processing"},{"id":265584,"name":"Optical Character Recognition","url":"https://www.academia.edu/Documents/in/Optical_Character_Recognition"},{"id":740724,"name":"Writing System","url":"https://www.academia.edu/Documents/in/Writing_System"}],"urls":[{"id":39345803,"url":"https://doi.org/10.1109/iscslp57327.2022.10038277"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726372"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726372/Handwriting_Data_Analysis_from_Crayonic_KeyVault_Smart_Security_Device"><img alt="Research paper thumbnail of Handwriting Data Analysis from Crayonic KeyVault Smart Security Device" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726372/Handwriting_Data_Analysis_from_Crayonic_KeyVault_Smart_Security_Device">Handwriting Data Analysis from Crayonic KeyVault Smart Security Device</a></div><div class="wp-workCard_item"><span>2022 20th International Conference on Emerging eLearning Technologies and Applications (ICETA)</span><span>, Oct 20, 2022</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726372"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726372"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726372; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var 
description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726372]").text(description); $(".js-view-count[data-work-id=114726372]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726372; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726372']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726372, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726372]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726372,"title":"Handwriting Data Analysis from Crayonic KeyVault Smart Security Device","translated_title":"","metadata":{"publication_date":{"day":20,"month":10,"year":2022,"errors":{}},"publication_name":"2022 20th International Conference on Emerging eLearning Technologies and Applications (ICETA)"},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726372/Handwriting_Data_Analysis_from_Crayonic_KeyVault_Smart_Security_Device","translated_internal_url":"","created_at":"2024-02-10T08:41:11.006-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Handwriting_Data_Analysis_from_Crayonic_KeyVault_Smart_Security_Device","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":465,"name":"Artificial Intelligence","url":"https://www.academia.edu/Documents/in/Artificial_Intelligence"},{"id":9173,"name":"Biometrics","url":"https://www.academia.edu/Documents/in/Biometrics"},{"id":11984,"name":"Speech Recognition","url":"https://www.academia.edu/Documents/in/Speech_Recognition"},{"id":46118,"name":"Handwriting","url":"https://www.academia.edu/Documents/in/Handwriting"},{"id":479186,"name":"Accelerometer","url":"https://www.academia.edu/Documents/in/Accelerometer"},{"id":1231305,"name":"Data 
Set","url":"https://www.academia.edu/Documents/in/Data_Set"}],"urls":[{"id":39345802,"url":"https://doi.org/10.1109/iceta57911.2022.9974843"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726371"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726371/Children_robot_spoken_interaction_in_selected_educational_scenarios"><img alt="Research paper thumbnail of Children-robot spoken interaction in selected educational scenarios" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726371/Children_robot_spoken_interaction_in_selected_educational_scenarios">Children-robot spoken interaction in selected educational scenarios</a></div><div class="wp-workCard_item"><span>2022 20th International Conference on Emerging eLearning Technologies and Applications (ICETA)</span><span>, Oct 20, 2022</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726371"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726371"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726371; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726371]").text(description); $(".js-view-count[data-work-id=114726371]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726371; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726371']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726371, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = 
window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726371]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726371,"title":"Children-robot spoken interaction in selected educational scenarios","translated_title":"","metadata":{"publication_date":{"day":20,"month":10,"year":2022,"errors":{}},"publication_name":"2022 20th International Conference on Emerging eLearning Technologies and Applications (ICETA)"},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726371/Children_robot_spoken_interaction_in_selected_educational_scenarios","translated_internal_url":"","created_at":"2024-02-10T08:41:10.813-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Children_robot_spoken_interaction_in_selected_educational_scenarios","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":99861,"name":"ROBOT","url":"https://www.academia.edu/Documents/in/ROBOT"},{"id":382135,"name":"Wizard of Oz","url":"https://www.academia.edu/Documents/in/Wizard_of_Oz"}],"urls":[{"id":39345801,"url":"https://doi.org/10.1109/iceta57911.2022.9974859"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726370"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726370/Multimodal_dialogue_system_with_NAO_and_VoiceXML_dialogue_manager"><img alt="Research paper thumbnail of Multimodal dialogue system with NAO and VoiceXML dialogue manager" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726370/Multimodal_dialogue_system_with_NAO_and_VoiceXML_dialogue_manager">Multimodal dialogue system with NAO and VoiceXML dialogue manager</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">The proposed paper describes a multimodal interactive system based on NAO humanoid robot with an ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">The proposed paper describes a multimodal interactive system based on NAO humanoid robot with an external dialogue management module VoiceON. System can be controlled by voice. 
NAO produces speech and gestures as a response to the user inputs. Presented multimodal system uses built-in speech recognition and speech synthesis modules adapted to Slovak language, which was originally not supported. Moreover, system accepts VoiceXML dialogue applications for all supported languages. To manage dialogue interaction, a previously developed VoiceXML dialogue manager was adopted. A new module for generation of multimodal output (speech and gestures) was designed, which enables NAO to produce gestures together with speech in several modes.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726370"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726370"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726370; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726370]").text(description); $(".js-view-count[data-work-id=114726370]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726370; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726370']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726370, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726370]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726370,"title":"Multimodal dialogue system with NAO and VoiceXML dialogue manager","translated_title":"","metadata":{"abstract":"The proposed paper describes a multimodal interactive system based on NAO humanoid robot with an external dialogue management module VoiceON. System can be controlled by voice. NAO produces speech and gestures as a response to the user inputs. 
Presented multimodal system uses built-in speech recognition and speech synthesis modules adapted to Slovak language, which was originally not supported. Moreover, system accepts VoiceXML dialogue applications for all supported languages. To manage dialogue interaction, a previously developed VoiceXML dialogue manager was adopted. A new module for generation of multimodal output (speech and gestures) was designed, which enables NAO to produce gestures together with speech in several modes.","publication_date":{"day":1,"month":9,"year":2017,"errors":{}}},"translated_abstract":"The proposed paper describes a multimodal interactive system based on NAO humanoid robot with an external dialogue management module VoiceON. System can be controlled by voice. NAO produces speech and gestures as a response to the user inputs. Presented multimodal system uses built-in speech recognition and speech synthesis modules adapted to Slovak language, which was originally not supported. Moreover, system accepts VoiceXML dialogue applications for all supported languages. To manage dialogue interaction, a previously developed VoiceXML dialogue manager was adopted. A new module for generation of multimodal output (speech and gestures) was designed, which enables NAO to produce gestures together with speech in several modes.","internal_url":"https://www.academia.edu/114726370/Multimodal_dialogue_system_with_NAO_and_VoiceXML_dialogue_manager","translated_internal_url":"","created_at":"2024-02-10T08:41:10.624-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Multimodal_dialogue_system_with_NAO_and_VoiceXML_dialogue_manager","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"}],"urls":[{"id":39345800,"url":"https://doi.org/10.1109/coginfocom.2017.8268286"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726369"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726369/Identification_of_Trolling_in_Memes_Using_Convolutional_Neural_Networks"><img alt="Research paper thumbnail of Identification of Trolling in Memes Using Convolutional Neural Networks" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726369/Identification_of_Trolling_in_Memes_Using_Convolutional_Neural_Networks">Identification of Trolling in Memes Using Convolutional Neural Networks</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span 
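For a flavour of the speech-plus-gesture mechanism such an output module must drive, the sketch below uses the NAOqi Python SDK's ALAnimatedSpeech service, which accepts text annotated with gesture tags. The robot address and the animation path are placeholders; the paper's VoiceON module itself is not reproduced here.

```python
# Minimal sketch with the NAOqi Python SDK (runs under the robot's
# Python 2 environment): speak while playing a gesture in parallel.
# NAO_IP and the animation name are placeholder assumptions.
from naoqi import ALProxy

NAO_IP = "192.168.1.10"   # placeholder robot address
speech = ALProxy("ALAnimatedSpeech", NAO_IP, 9559)

# ^start(...) launches an installed animation alongside the TTS output;
# "contextual" body-language mode lets NAO add small gestures on its own.
configuration = {"bodyLanguageMode": "contextual"}
speech.say("Hello! ^start(animations/Stand/Gestures/Hey_1) Nice to meet you.",
           configuration)
```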
class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726369"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726369"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726369; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726369]").text(description); $(".js-view-count[data-work-id=114726369]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726369; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726369']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726369, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726369]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726369,"title":"Identification of Trolling in Memes Using Convolutional Neural Networks","translated_title":"","metadata":{"publication_date":{"day":19,"month":4,"year":2023,"errors":{}}},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726369/Identification_of_Trolling_in_Memes_Using_Convolutional_Neural_Networks","translated_internal_url":"","created_at":"2024-02-10T08:41:10.409-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Identification_of_Trolling_in_Memes_Using_Convolutional_Neural_Networks","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":465,"name":"Artificial 
Intelligence","url":"https://www.academia.edu/Documents/in/Artificial_Intelligence"},{"id":988977,"name":"Lexical Analysis","url":"https://www.academia.edu/Documents/in/Lexical_Analysis"},{"id":1568111,"name":"Convolutional Neural Network","url":"https://www.academia.edu/Documents/in/Convolutional_Neural_Network"}],"urls":[{"id":39345799,"url":"https://doi.org/10.1109/radioelektronika57919.2023.10109044"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726368"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726368/Preliminary_evaluation_of_the_multimodal_interactive_system_for_NAO_robot"><img alt="Research paper thumbnail of Preliminary evaluation of the multimodal interactive system for NAO robot" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726368/Preliminary_evaluation_of_the_multimodal_interactive_system_for_NAO_robot">Preliminary evaluation of the multimodal interactive system for NAO robot</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">The proposed paper brings a description of the pilot version of the multimodal dialogue system fo...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">The proposed paper brings a description of the pilot version of the multimodal dialogue system for NAO humanoid robot. Designed system enables multimodal interaction with the user in such manner that it takes a speech input from the user and it answers by a combination of synthetic speech and gestures. The core of the system is an external dialogue manager, which interprets VoiceXML language. A pilot speech communication application was designed and a preliminary evaluation was performed using subjective methods. Results of the preliminary evaluation highlights importance of involving gestures into communication exchange. 
Moreover the paper brings a discussion of usage scenarios, where designed system can be used for educational purposes.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726368"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726368"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726368; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726368]").text(description); $(".js-view-count[data-work-id=114726368]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726368; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726368']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726368, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726368]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726368,"title":"Preliminary evaluation of the multimodal interactive system for NAO robot","translated_title":"","metadata":{"abstract":"The proposed paper brings a description of the pilot version of the multimodal dialogue system for NAO humanoid robot. Designed system enables multimodal interaction with the user in such manner that it takes a speech input from the user and it answers by a combination of synthetic speech and gestures. The core of the system is an external dialogue manager, which interprets VoiceXML language. A pilot speech communication application was designed and a preliminary evaluation was performed using subjective methods. Results of the preliminary evaluation highlights importance of involving gestures into communication exchange. 
Moreover the paper brings a discussion of usage scenarios, where designed system can be used for educational purposes.","publication_date":{"day":1,"month":10,"year":2017,"errors":{}}},"translated_abstract":"The proposed paper brings a description of the pilot version of the multimodal dialogue system for NAO humanoid robot. Designed system enables multimodal interaction with the user in such manner that it takes a speech input from the user and it answers by a combination of synthetic speech and gestures. The core of the system is an external dialogue manager, which interprets VoiceXML language. A pilot speech communication application was designed and a preliminary evaluation was performed using subjective methods. Results of the preliminary evaluation highlights importance of involving gestures into communication exchange. Moreover the paper brings a discussion of usage scenarios, where designed system can be used for educational purposes.","internal_url":"https://www.academia.edu/114726368/Preliminary_evaluation_of_the_multimodal_interactive_system_for_NAO_robot","translated_internal_url":"","created_at":"2024-02-10T08:41:10.199-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Preliminary_evaluation_of_the_multimodal_interactive_system_for_NAO_robot","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":472,"name":"Human Computer Interaction","url":"https://www.academia.edu/Documents/in/Human_Computer_Interaction"},{"id":3147,"name":"Gesture","url":"https://www.academia.edu/Documents/in/Gesture"},{"id":11081,"name":"Multimodal Interaction","url":"https://www.academia.edu/Documents/in/Multimodal_Interaction"},{"id":428932,"name":"Humanoid robot","url":"https://www.academia.edu/Documents/in/Humanoid_robot"}],"urls":[{"id":39345798,"url":"https://doi.org/10.1109/iceta.2017.8102515"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726367"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726367/Static_Audio_Keystroke_Dynamics"><img alt="Research paper thumbnail of Static Audio Keystroke Dynamics" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726367/Static_Audio_Keystroke_Dynamics">Static Audio Keystroke Dynamics</a></div><div class="wp-workCard_item"><span>Springer eBooks</span><span>, 2015</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">In this paper we investigate the accuracy of an identification scheme based 
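The evaluation above used subjective methods. One common way to summarize such data (an assumption here, since the paper's exact questionnaire is not restated) is a mean opinion score with a confidence interval over Likert-style ratings:

```python
# Hypothetical summary of subjective Likert ratings (1-5) from a pilot
# evaluation: mean opinion score with a normal-approximation 95% CI.
import math

ratings = [4, 5, 3, 4, 4, 5, 2, 4]          # placeholder responses
n = len(ratings)
mos = sum(ratings) / n
sd = math.sqrt(sum((r - mos) ** 2 for r in ratings) / (n - 1))
ci = 1.96 * sd / math.sqrt(n)

print(f"MOS = {mos:.2f} +/- {ci:.2f} (n={n})")
```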
Research paper: Static Audio Keystroke Dynamics
Springer eBooks (Springer Nature), 2015
Abstract: In this paper we investigate the accuracy of an identification scheme based on the sound of typing a password. The novelty lies in comparing the performance of timing-based and audio-based keystroke dynamics data in both an authentication and an identification setting. We collected data from 50 people typing the same given password 100 times, divided into 4 sessions of 25 typings, and tested how well the system could recognize the correct typist. Training with data from 3 sessions and testing with the remaining session, we achieved a maximal accuracy of 97.3% using cross-validation; training with 1 session and testing with the 3 remaining sessions still achieved 90.6%. The results show the potential of audio keystroke dynamics as a way to identify users during log-on.
Keywords: Computer Science, Speech Recognition, Novelty, Password, Keystroke logging, Keystroke Dynamics, Springer Ebooks
DOI: https://doi.org/10.1007/978-3-319-26404-2_13
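The protocol described above (train on three sessions, test on the held-out one) corresponds to leave-one-group-out cross-validation with the session as the group. A minimal sketch, assuming placeholder feature vectors per typing sample and an SVM (the paper's actual features and classifier are not restated here):

```python
# Sketch of the session-held-out protocol: 50 users x 4 sessions x 25
# typings. Train on 3 sessions, test on the remaining one, per fold.
# Features and classifier are placeholder assumptions; random features
# give chance-level accuracy, real ones would be timing/audio descriptors.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_users, n_sessions, n_reps, n_feats = 50, 4, 25, 40

X = rng.normal(size=(n_users * n_sessions * n_reps, n_feats))
y = np.repeat(np.arange(n_users), n_sessions * n_reps)           # typist id
sessions = np.tile(np.repeat(np.arange(n_sessions), n_reps), n_users)

logo = LeaveOneGroupOut()                 # one fold per held-out session
for train_idx, test_idx in logo.split(X, y, groups=sessions):
    clf = SVC().fit(X[train_idx], y[train_idx])
    print(f"held-out session accuracy: {clf.score(X[test_idx], y[test_idx]):.3f}")
```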
class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726366/What_Behavioral_Signals_could_be_Used_to_Identify_Users">What Behavioral Signals could be Used to Identify Users?</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726366"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726366"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726366; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726366]").text(description); $(".js-view-count[data-work-id=114726366]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726366; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726366']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726366, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726366]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726366,"title":"What Behavioral Signals could be Used to Identify 
Users?","translated_title":"","metadata":{"publication_date":{"day":19,"month":1,"year":2023,"errors":{}}},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726366/What_Behavioral_Signals_could_be_Used_to_Identify_Users","translated_internal_url":"","created_at":"2024-02-10T08:41:09.332-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"What_Behavioral_Signals_could_be_Used_to_Identify_Users","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":472,"name":"Human Computer Interaction","url":"https://www.academia.edu/Documents/in/Human_Computer_Interaction"},{"id":9173,"name":"Biometrics","url":"https://www.academia.edu/Documents/in/Biometrics"},{"id":50642,"name":"Virtual Reality","url":"https://www.academia.edu/Documents/in/Virtual_Reality"},{"id":759573,"name":"Login","url":"https://www.academia.edu/Documents/in/Login"}],"urls":[{"id":39345796,"url":"https://doi.org/10.1109/sami58000.2023.10044502"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726362"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726362/BERT_based_Chinese_Medicine_Named_Entity_Recognition_Model_Applied_to_Medication_Reminder_Dialogue_System"><img alt="Research paper thumbnail of BERT-based Chinese Medicine Named Entity Recognition Model Applied to Medication Reminder Dialogue System" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726362/BERT_based_Chinese_Medicine_Named_Entity_Recognition_Model_Applied_to_Medication_Reminder_Dialogue_System">BERT-based Chinese Medicine Named Entity Recognition Model Applied to Medication Reminder Dialogue System</a></div><div class="wp-workCard_item"><span>2022 13th International Symposium on Chinese Spoken Language Processing (ISCSLP)</span><span>, Dec 11, 2022</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726362"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="114726362"><i 
class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726362; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726362]").text(description); $(".js-view-count[data-work-id=114726362]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726362; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726362']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726362, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726362]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726362,"title":"BERT-based Chinese Medicine Named Entity Recognition Model Applied to Medication Reminder Dialogue System","translated_title":"","metadata":{"publication_date":{"day":11,"month":12,"year":2022,"errors":{}},"publication_name":"2022 13th International Symposium on Chinese Spoken Language Processing (ISCSLP)"},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726362/BERT_based_Chinese_Medicine_Named_Entity_Recognition_Model_Applied_to_Medication_Reminder_Dialogue_System","translated_internal_url":"","created_at":"2024-02-10T08:41:09.012-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"BERT_based_Chinese_Medicine_Named_Entity_Recognition_Model_Applied_to_Medication_Reminder_Dialogue_System","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":29205,"name":"Named Entity Recognition","url":"https://www.academia.edu/Documents/in/Named_Entity_Recognition"},{"id":60107,"name":"Traditional Chinese Medicine","url":"https://www.academia.edu/Documents/in/Traditional_Chinese_Medicine"}],"urls":[{"id":39345793,"url":"https://doi.org/10.1109/iscslp57327.2022.10037867"}]}, dispatcherData: dispatcherData }); 
Research paper: EMG Data Collection for Multimodal Keystroke Analysis
2022 12th International Conference on Advanced Computer Information Technologies (ACIT), Sep 26, 2022
Keywords: Computer Science, Software, Password, Keystroke logging, Keystroke Dynamics
DOI: https://doi.org/10.1109/acit54803.2022.9913124

Research paper: Modified Ling Six Sound Test Audiometry Application
2022 12th International Conference on Advanced Computer Information Technologies (ACIT), IEEE
Keywords: Computer Science, Audiology, Speech Recognition, Cochlear Implant, Audiometry
Full text: http://xplorestaging.ieee.org/ielx7/9912736/9912739/09913152.pdf?arnumber=9913152
適合漸凍人使用之語音轉換系統初步研究 (Deep Neural-Network Bandwidth Extension and Denoising Voice Conversion System for ALS Patients)
IJCLCLP, 2019 (in Chinese). The 2019 Conference on Computational Linguistics and Speech Processing (ROCLING 2019), pp. 52–66. © The Association for Computational Linguistics and Chinese Language Processing.
URL: https://www.aclanthology.org/2019.ijclclp-2.3.pdf
Keywords: Computer Science; Speech Recognition; Noise reduction; Artificial Neural Network
Abstract (translated from Chinese): Amyotrophic lateral sclerosis (ALS) is a neurodegenerative disease for which there is currently no cure; patients gradually lose the ability to speak and ultimately can no longer communicate by voice, losing part of their self-identity. ALS patients therefore need suitable voice output communication aids (VOCAs), in particular ones with a personalized synthetic voice, i.e., the patient's own voice from before the onset of the disease, so that they can preserve their sense of self. However, most late-stage ALS patients who can no longer speak did not preserve personal recordings in advance; at best, only about 20 minutes of low-quality speech can be found, for example lossily compressed (MP3), band-limited (8 kHz), or contaminated by strong background noise, so a personalized speech synthesis system suitable for the patient cannot be built directly. To address these difficulties, this paper combines a general-purpose speech synthesis system with a voice conversion algorithm, adding a speech denoising stage at the front end and a speech super-resolution (bandwidth extension) module at the back end. The system can thus tolerate recordings with background noise and restore high-frequency content (16 kHz) to band-limited synthetic speech, reconstructing, as far as possible, high-quality synthetic speech close to the ALS patient's original voice from low-quality material. Speech denoising uses WaveNet, while speech super-resolution uses a U-Net architecture. Paired noisy/clean and narrowband/wideband utterances were first simulated from a 20-hour high-quality (studio-recorded) educational radio corpus.
System for Automatic Transcription of Audio Meetings in Slovak
Abstract: The main aim of the pilot project is the research and development of a meeting speech recognition system for the Slovak language. The system includes:
- voice recording of meetings in small conference rooms with a limited number of participants, using a microphone array;
- automatic domain-specific, speaker-dependent speech-to-text transcription in Slovak;
- management for storing, browsing, searching, and synchronizing transcripts with the audio recordings.
The expected outcome is a functional prototype: a practically applicable software product for automatic transcription of meetings, educational talks, and lectures.
Keywords: Engineering; Computer Science; Slovak; Speech Recognition; Software; Microphone
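The third listed component, synchronization of transcripts with the audio, reduces to storing time-stamped, speaker-labeled segments so that every search hit can seek the player to the right moment. As a rough illustration only (hypothetical field names, not the project's actual schema):

from dataclasses import dataclass
from typing import List

@dataclass
class Segment:
    speaker: str   # speaker label (the transcription is speaker-dependent)
    start: float   # segment start time in the recording, in seconds
    end: float     # segment end time, in seconds
    text: str      # recognized Slovak text

def find_segments(transcript: List[Segment], query: str) -> List[Segment]:
    """Naive full-text search; each hit keeps its timestamps, so a
    player can jump straight to the matching audio passage."""
    q = query.lower()
    return [s for s in transcript if q in s.text.lower()]

transcript = [
    Segment("spk1", 0.0, 4.2, "Vitajte na dnešnom stretnutí."),
    Segment("spk2", 4.2, 9.8, "Začneme prehľadom projektu."),
]
for hit in find_segments(transcript, "projektu"):
    print(f"[{hit.start:.1f}s-{hit.end:.1f}s] {hit.speaker}: {hit.text}")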
Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[{"id":111344878,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/111344878/thumbnails/1.jpg","file_name":"3-TUKE_System_na_automaticke_rozpoznavanie_a_prepis_mitingovych_zaznamov.pdf","download_url":"https://www.academia.edu/attachments/111344878/download_file?st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&st=MTczMjgyNDUyOSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"System_for_Automatic_Transcription_of_Au.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/111344878/3-TUKE_System_na_automaticke_rozpoznavanie_a_prepis_mitingovych_zaznamov-libre.pdf?1707584135=\u0026response-content-disposition=attachment%3B+filename%3DSystem_for_Automatic_Transcription_of_Au.pdf\u0026Expires=1732828129\u0026Signature=TiCXNaKxltPpbE8mvxMTmwSQ1vU5yKUYgJcSc5X4yhhjx3eYAu0sB1OF9jCLtyh2h2S4OmNY2FbfPzLYC2Ca5hoGClaQMmD0vStxyFUwO2iLQ1IrOroephCUIoiMYh0~jjogfBJ90mjiNeK0PqU-GN0ZEvIMXQnr1IUgw97U42TtTerpD-pX1qJtjsCQQaPyptKeo2tYwnRMqwzviCOkG5FZjV04agEspY9wUFLNEkAGjXbEhQQhpvM4xQdzlZC-53TfO3Sy2jpA2lqFqDUMa5YS0Zuu8m~mGr40IZiBNYjUKJveWw6dzEbuUg-0Thwamw7AmiPNQOjYvKiVpUMujw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":48,"name":"Engineering","url":"https://www.academia.edu/Documents/in/Engineering"},{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":6121,"name":"Slovak","url":"https://www.academia.edu/Documents/in/Slovak"},{"id":11984,"name":"Speech Recognition","url":"https://www.academia.edu/Documents/in/Speech_Recognition"},{"id":53293,"name":"Software","url":"https://www.academia.edu/Documents/in/Software"},{"id":1748596,"name":"Microphone","url":"https://www.academia.edu/Documents/in/Microphone"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="114726350"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/114726350/Biometric_User_Identification_by_Forearm_EMG_Analysis"><img alt="Research paper thumbnail of Biometric User Identification by Forearm EMG Analysis" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/114726350/Biometric_User_Identification_by_Forearm_EMG_Analysis">Biometric User Identification by Forearm EMG Analysis</a></div><div class="wp-workCard_item"><span>2022 IEEE International Conference on Consumer Electronics - Taiwan</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="114726350"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" 
data-work-id="114726350"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 114726350; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=114726350]").text(description); $(".js-view-count[data-work-id=114726350]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 114726350; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='114726350']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 114726350, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=114726350]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":114726350,"title":"Biometric User Identification by Forearm EMG Analysis","translated_title":"","metadata":{"publisher":"IEEE","publication_name":"2022 IEEE International Conference on Consumer Electronics - Taiwan"},"translated_abstract":null,"internal_url":"https://www.academia.edu/114726350/Biometric_User_Identification_by_Forearm_EMG_Analysis","translated_internal_url":"","created_at":"2024-02-10T08:40:37.946-08:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Biometric_User_Identification_by_Forearm_EMG_Analysis","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":465,"name":"Artificial Intelligence","url":"https://www.academia.edu/Documents/in/Artificial_Intelligence"},{"id":9173,"name":"Biometrics","url":"https://www.academia.edu/Documents/in/Biometrics"},{"id":50642,"name":"Virtual Reality","url":"https://www.academia.edu/Documents/in/Virtual_Reality"},{"id":170918,"name":"Electromyography","url":"https://www.academia.edu/Documents/in/Electromyography"},{"id":1568111,"name":"Convolutional Neural 
Network","url":"https://www.academia.edu/Documents/in/Convolutional_Neural_Network"}],"urls":[{"id":39345782,"url":"http://xplorestaging.ieee.org/ielx7/9868970/9868972/09869268.pdf?arnumber=9869268"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="104938291"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/104938291/Formosa_Speech_Recognition_Challenge_2018_Data_Plan_and_Baselines"><img alt="Research paper thumbnail of Formosa Speech Recognition Challenge 2018: Data, Plan and Baselines" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/104938291/Formosa_Speech_Recognition_Challenge_2018_Data_Plan_and_Baselines">Formosa Speech Recognition Challenge 2018: Data, Plan and Baselines</a></div><div class="wp-workCard_item"><span>2018 11th International Symposium on Chinese Spoken Language Processing (ISCSLP)</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">This paper introduces the Formosa speech recognition (FSR) challenge 2018, presents the provided ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">This paper introduces the Formosa speech recognition (FSR) challenge 2018, presents the provided data profile, evaluation plan and reports the experimental results of the baseline systems. This challenge focuses on spontaneous Taiwanese Mandarin speech recognition (TMSR) and it is based on a real-life, multigene broadcast radio speech corpus, NER-Trs-Vol1, selected from the Formosa speech in the wild (FSW) project. To assist participants to establish a good starting system, a set of baseline systems were published based on various deep neural network (DNN) models. NER-Trs-Vol1 is free for participants (noncommercial license), and its corresponding Kaldi recipes for the baselines have been published online. 
Experimental results show that the combination of NER-Trs-Vol1 and Kaldi recipes is a good resource pack for spontaneous TMSR research and could be used to initialize an advanced semi-supervised training procedure to further improve the recognition performance.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="104938291"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="104938291"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 104938291; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=104938291]").text(description); $(".js-view-count[data-work-id=104938291]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 104938291; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='104938291']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 104938291, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=104938291]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":104938291,"title":"Formosa Speech Recognition Challenge 2018: Data, Plan and Baselines","translated_title":"","metadata":{"abstract":"This paper introduces the Formosa speech recognition (FSR) challenge 2018, presents the provided data profile, evaluation plan and reports the experimental results of the baseline systems. This challenge focuses on spontaneous Taiwanese Mandarin speech recognition (TMSR) and it is based on a real-life, multigene broadcast radio speech corpus, NER-Trs-Vol1, selected from the Formosa speech in the wild (FSW) project. To assist participants to establish a good starting system, a set of baseline systems were published based on various deep neural network (DNN) models. 
NER-Trs-Vol1 is free for participants (noncommercial license), and its corresponding Kaldi recipes for the baselines have been published online. Experimental results show that the combination of NER-Trs-Vol1 and Kaldi recipes is a good resource pack for spontaneous TMSR research and could be used to initialize an advanced semi-supervised training procedure to further improve the recognition performance.","publisher":"IEEE","publication_name":"2018 11th International Symposium on Chinese Spoken Language Processing (ISCSLP)"},"translated_abstract":"This paper introduces the Formosa speech recognition (FSR) challenge 2018, presents the provided data profile, evaluation plan and reports the experimental results of the baseline systems. This challenge focuses on spontaneous Taiwanese Mandarin speech recognition (TMSR) and it is based on a real-life, multigene broadcast radio speech corpus, NER-Trs-Vol1, selected from the Formosa speech in the wild (FSW) project. To assist participants to establish a good starting system, a set of baseline systems were published based on various deep neural network (DNN) models. NER-Trs-Vol1 is free for participants (noncommercial license), and its corresponding Kaldi recipes for the baselines have been published online. Experimental results show that the combination of NER-Trs-Vol1 and Kaldi recipes is a good resource pack for spontaneous TMSR research and could be used to initialize an advanced semi-supervised training procedure to further improve the recognition performance.","internal_url":"https://www.academia.edu/104938291/Formosa_Speech_Recognition_Challenge_2018_Data_Plan_and_Baselines","translated_internal_url":"","created_at":"2023-07-25T22:20:01.601-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Formosa_Speech_Recognition_Challenge_2018_Data_Plan_and_Baselines","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":11984,"name":"Speech Recognition","url":"https://www.academia.edu/Documents/in/Speech_Recognition"},{"id":339518,"name":"LICENSE","url":"https://www.academia.edu/Documents/in/LICENSE"}],"urls":[{"id":33080472,"url":"http://xplorestaging.ieee.org/ielx7/8701133/8706262/08706700.pdf?arnumber=8706700"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="104938290"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/104938290/Human_Computer_Interaction_for_Intelligent_Systems"><img alt="Research paper thumbnail of Human–Computer Interaction for Intelligent Systems" class="work-thumbnail" src="https://attachments.academia-assets.com/104531848/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a 
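The abstract reports baseline results without naming the metric here, but challenges of this kind are conventionally scored by word error rate (or character error rate for Mandarin): the minimum edit distance between reference and hypothesis divided by the reference length. A self-contained sketch of that standard computation:

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: Levenshtein distance over word sequences,
    normalized by the number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits needed to turn the first i reference words
    # into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i  # i deletions
    for j in range(len(hyp) + 1):
        dp[0][j] = j  # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

print(wer("the cat sat", "the cat sat down"))  # 0.333... (one insertion)

For Mandarin transcripts the same routine is typically applied to character sequences instead of space-separated words.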
class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/104938290/Human_Computer_Interaction_for_Intelligent_Systems">Human–Computer Interaction for Intelligent Systems</a></div><div class="wp-workCard_item"><span>Electronics</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">The further development of human–computer interaction applications is still in great demand as us...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">The further development of human–computer interaction applications is still in great demand as users expect more natural interactions [...]</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="ad0b79d6639a45837d8af05eb5d7f399" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:104531848,&quot;asset_id&quot;:104938290,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/104531848/download_file?st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="104938290"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="104938290"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 104938290; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=104938290]").text(description); $(".js-view-count[data-work-id=104938290]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 104938290; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='104938290']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 104938290, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb 
var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "ad0b79d6639a45837d8af05eb5d7f399" } } $('.js-work-strip[data-work-id=104938290]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":104938290,"title":"Human–Computer Interaction for Intelligent Systems","translated_title":"","metadata":{"abstract":"The further development of human–computer interaction applications is still in great demand as users expect more natural interactions [...]","publisher":"MDPI AG","publication_name":"Electronics"},"translated_abstract":"The further development of human–computer interaction applications is still in great demand as users expect more natural interactions [...]","internal_url":"https://www.academia.edu/104938290/Human_Computer_Interaction_for_Intelligent_Systems","translated_internal_url":"","created_at":"2023-07-25T22:20:01.280-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":104531848,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/104531848/thumbnails/1.jpg","file_name":"pdf.pdf","download_url":"https://www.academia.edu/attachments/104531848/download_file?st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_Computer_Interaction_for_Intellige.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/104531848/pdf-libre.pdf?1690355193=\u0026response-content-disposition=attachment%3B+filename%3DHuman_Computer_Interaction_for_Intellige.pdf\u0026Expires=1732828130\u0026Signature=apCygspxz19ppXCiruuI8bHRuNmX-XQU4q9h1bcf0fcTfqvrexoa2DeH7fFg8xG6WEAfXQdPy9mDQnFAsWIkzACQ0c4YveM9EqiwPg7PIQGCBUm4bLnBzknG-F14-pkDKEkEL5i5XgGQGjaAKGcQbwlPwELftvWuSr7EXrdxCZwB3x6JC0T~83xq5uwLlCCG-BRQr-jmdHk7aVqGoCpHjSkVdEBCuvMVc7aHQZjAY8BoPaxRwlHlLep6EZxMzooUFWRG3FhIHZDTYBlTpeysfkHz3rJj2avctPjjZ3Gn1EczRaOMZWz9CtJQhc4tcl7cSRd3u7neC7-im2Bc4hOjNw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Human_Computer_Interaction_for_Intelligent_Systems","translated_slug":"","page_count":4,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus 
Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[{"id":104531848,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/104531848/thumbnails/1.jpg","file_name":"pdf.pdf","download_url":"https://www.academia.edu/attachments/104531848/download_file?st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&st=MTczMjgyNDUzMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_Computer_Interaction_for_Intellige.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/104531848/pdf-libre.pdf?1690355193=\u0026response-content-disposition=attachment%3B+filename%3DHuman_Computer_Interaction_for_Intellige.pdf\u0026Expires=1732828130\u0026Signature=apCygspxz19ppXCiruuI8bHRuNmX-XQU4q9h1bcf0fcTfqvrexoa2DeH7fFg8xG6WEAfXQdPy9mDQnFAsWIkzACQ0c4YveM9EqiwPg7PIQGCBUm4bLnBzknG-F14-pkDKEkEL5i5XgGQGjaAKGcQbwlPwELftvWuSr7EXrdxCZwB3x6JC0T~83xq5uwLlCCG-BRQr-jmdHk7aVqGoCpHjSkVdEBCuvMVc7aHQZjAY8BoPaxRwlHlLep6EZxMzooUFWRG3FhIHZDTYBlTpeysfkHz3rJj2avctPjjZ3Gn1EczRaOMZWz9CtJQhc4tcl7cSRd3u7neC7-im2Bc4hOjNw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":472,"name":"Human Computer Interaction","url":"https://www.academia.edu/Documents/in/Human_Computer_Interaction"},{"id":4758,"name":"Electronics","url":"https://www.academia.edu/Documents/in/Electronics"},{"id":20492,"name":"Spelling","url":"https://www.academia.edu/Documents/in/Spelling"},{"id":60270,"name":"Modalities","url":"https://www.academia.edu/Documents/in/Modalities"},{"id":3418189,"name":"testbed","url":"https://www.academia.edu/Documents/in/testbed"}],"urls":[{"id":33080471,"url":"https://www.mdpi.com/2079-9292/12/1/161/pdf"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="104938289"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/104938289/Comparison_of_Statistical_Algorithms_and_Deep_Learning_for_Slovak_Document_Classification"><img alt="Research paper thumbnail of Comparison of Statistical Algorithms and Deep Learning for Slovak Document Classification" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/104938289/Comparison_of_Statistical_Algorithms_and_Deep_Learning_for_Slovak_Document_Classification">Comparison of Statistical Algorithms and Deep Learning for Slovak Document Classification</a></div><div class="wp-workCard_item"><span>2022 IEEE International Conference on Consumer Electronics - Taiwan</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="104938289"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div 
class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="104938289"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 104938289; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=104938289]").text(description); $(".js-view-count[data-work-id=104938289]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 104938289; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='104938289']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 104938289, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=104938289]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":104938289,"title":"Comparison of Statistical Algorithms and Deep Learning for Slovak Document Classification","translated_title":"","metadata":{"publisher":"IEEE","publication_name":"2022 IEEE International Conference on Consumer Electronics - Taiwan"},"translated_abstract":null,"internal_url":"https://www.academia.edu/104938289/Comparison_of_Statistical_Algorithms_and_Deep_Learning_for_Slovak_Document_Classification","translated_internal_url":"","created_at":"2023-07-25T22:20:00.700-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Comparison_of_Statistical_Algorithms_and_Deep_Learning_for_Slovak_Document_Classification","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":465,"name":"Artificial Intelligence","url":"https://www.academia.edu/Documents/in/Artificial_Intelligence"},{"id":1432,"name":"Natural Language 
Processing","url":"https://www.academia.edu/Documents/in/Natural_Language_Processing"},{"id":6121,"name":"Slovak","url":"https://www.academia.edu/Documents/in/Slovak"},{"id":3163891,"name":"Statistical Classification","url":"https://www.academia.edu/Documents/in/Statistical_Classification"}],"urls":[{"id":33080470,"url":"http://xplorestaging.ieee.org/ielx7/9868970/9868972/09869155.pdf?arnumber=9869155"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="104938288"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/104938288/Slovak_dialogue_corpus_with_backchannel_annotation"><img alt="Research paper thumbnail of Slovak dialogue corpus with backchannel annotation" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/104938288/Slovak_dialogue_corpus_with_backchannel_annotation">Slovak dialogue corpus with backchannel annotation</a></div><div class="wp-workCard_item"><span>2022 32nd International Conference Radioelektronika (RADIOELEKTRONIKA)</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="104938288"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="104938288"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 104938288; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=104938288]").text(description); $(".js-view-count[data-work-id=104938288]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 104938288; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='104938288']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 104938288, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 
})(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=104938288]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":104938288,"title":"Slovak dialogue corpus with backchannel annotation","translated_title":"","metadata":{"publisher":"IEEE","publication_name":"2022 32nd International Conference Radioelektronika (RADIOELEKTRONIKA)"},"translated_abstract":null,"internal_url":"https://www.academia.edu/104938288/Slovak_dialogue_corpus_with_backchannel_annotation","translated_internal_url":"","created_at":"2023-07-25T22:20:00.349-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Slovak_dialogue_corpus_with_backchannel_annotation","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":1432,"name":"Natural Language Processing","url":"https://www.academia.edu/Documents/in/Natural_Language_Processing"},{"id":6121,"name":"Slovak","url":"https://www.academia.edu/Documents/in/Slovak"},{"id":11984,"name":"Speech Recognition","url":"https://www.academia.edu/Documents/in/Speech_Recognition"},{"id":38072,"name":"Annotation","url":"https://www.academia.edu/Documents/in/Annotation"}],"urls":[{"id":33080469,"url":"http://xplorestaging.ieee.org/ielx7/9764897/9764898/09764955.pdf?arnumber=9764955"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="104938287"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/104938287/Personalized_Taiwanese_Speech_Synthesis_using_Cascaded_ASR_and_TTS_Framework"><img alt="Research paper thumbnail of Personalized Taiwanese Speech Synthesis using Cascaded ASR and TTS Framework" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/104938287/Personalized_Taiwanese_Speech_Synthesis_using_Cascaded_ASR_and_TTS_Framework">Personalized Taiwanese Speech Synthesis using Cascaded ASR and TTS Framework</a></div><div class="wp-workCard_item"><span>2022 32nd International Conference Radioelektronika (RADIOELEKTRONIKA)</span></div><div 
class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="104938287"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="104938287"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 104938287; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=104938287]").text(description); $(".js-view-count[data-work-id=104938287]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 104938287; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='104938287']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 104938287, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=104938287]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":104938287,"title":"Personalized Taiwanese Speech Synthesis using Cascaded ASR and TTS Framework","translated_title":"","metadata":{"publisher":"IEEE","publication_name":"2022 32nd International Conference Radioelektronika (RADIOELEKTRONIKA)"},"translated_abstract":null,"internal_url":"https://www.academia.edu/104938287/Personalized_Taiwanese_Speech_Synthesis_using_Cascaded_ASR_and_TTS_Framework","translated_internal_url":"","created_at":"2023-07-25T22:19:59.866-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":160433363,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Personalized_Taiwanese_Speech_Synthesis_using_Cascaded_ASR_and_TTS_Framework","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":160433363,"first_name":"Matus","middle_initials":null,"last_name":"Pleva","page_name":"MatusPleva","domain_name":"tuke","created_at":"2020-06-07T14:11:52.465-07:00","display_name":"Matus 
Pleva","url":"https://tuke.academia.edu/MatusPleva"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":1432,"name":"Natural Language Processing","url":"https://www.academia.edu/Documents/in/Natural_Language_Processing"},{"id":2342,"name":"Speech Synthesis","url":"https://www.academia.edu/Documents/in/Speech_Synthesis"},{"id":11984,"name":"Speech Recognition","url":"https://www.academia.edu/Documents/in/Speech_Recognition"},{"id":200138,"name":"Mandarin Chinese","url":"https://www.academia.edu/Documents/in/Mandarin_Chinese"},{"id":726220,"name":"Speech corpus","url":"https://www.academia.edu/Documents/in/Speech_corpus"}],"urls":[{"id":33080468,"url":"http://xplorestaging.ieee.org/ielx7/9764897/9764898/09764940.pdf?arnumber=9764940"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> </div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/google_contacts-0dfb882d836b94dbcb4a2d123d6933fc9533eda5be911641f20b4eb428429600.js"], function() { // from javascript_helper.rb $('.js-google-connect-button').click(function(e) { e.preventDefault(); GoogleContacts.authorize_and_show_contacts(); Aedu.Dismissibles.recordClickthrough("WowProfileImportContactsPrompt"); }); $('.js-update-biography-button').click(function(e) { e.preventDefault(); Aedu.Dismissibles.recordClickthrough("UpdateUserBiographyPrompt"); $.ajax({ url: $r.api_v0_profiles_update_about_path({ subdomain_param: 'api', about: "", }), type: 'PUT', success: function(response) { location.reload(); } }); }); $('.js-work-creator-button').click(function (e) { e.preventDefault(); window.location = $r.upload_funnel_document_path({ source: encodeURIComponent(""), }); }); $('.js-video-upload-button').click(function (e) { e.preventDefault(); window.location = $r.upload_funnel_video_path({ source: encodeURIComponent(""), }); }); $('.js-do-this-later-button').click(function() { $(this).closest('.js-profile-nag-panel').remove(); Aedu.Dismissibles.recordDismissal("WowProfileImportContactsPrompt"); }); $('.js-update-biography-do-this-later-button').click(function(){ $(this).closest('.js-profile-nag-panel').remove(); Aedu.Dismissibles.recordDismissal("UpdateUserBiographyPrompt"); }); $('.wow-profile-mentions-upsell--close').click(function(){ $('.wow-profile-mentions-upsell--panel').hide(); Aedu.Dismissibles.recordDismissal("WowProfileMentionsUpsell"); }); $('.wow-profile-mentions-upsell--button').click(function(){ Aedu.Dismissibles.recordClickthrough("WowProfileMentionsUpsell"); }); new WowProfile.SocialRedesignUserWorks({ initialWorksOffset: 20, allWorksOffset: 20, maxSections: 1 }) }); </script> </div></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile_edit-5ea339ee107c863779f560dd7275595239fed73f1a13d279d2b599a28c0ecd33.js","https://a.academia-assets.com/assets/add_coauthor-22174b608f9cb871d03443cafa7feac496fb50d7df2d66a53f5ee3c04ba67f53.js","https://a.academia-assets.com/assets/tab-dcac0130902f0cc2d8cb403714dd47454f11fc6fb0e99ae6a0827b06613abc20.js","https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js"], function() { // from javascript_helper.rb window.ae = window.ae 
