Search results for: computer human interaction
Commenced in January 2007 | Frequency: Monthly | Edition: International | Paper Count: 13527

13527. Justyna Skrzyńska, Zdzisław Kobos, Zbigniew Wochyński
Authors: Vahid Bairami Rad
Abstract: Due to the tremendous progress in computer technology in recent decades, the capabilities of computers have increased enormously, and working with a computer has become a normal activity for nearly everybody. With all the possibilities a computer can offer, humans and their interaction with computers are now a limiting factor. This gave rise to a great deal of research in the field of human-computer interaction (HCI) aiming to make interaction easier, more intuitive, and more efficient. To research eye-gaze-based interfaces, it is necessary to understand both sides of the interaction: the human eye and the eye tracker. The first section gives an overview of the anatomy of the eye. The second section discusses accuracy and calibration issues. The subsequent section presents data from a user study in which eye movements were recorded while watching a video and while surfing the Internet. Statistics on eye movement during these tasks for several individuals provide typical values and ranges for fixation times and saccade lengths and are the foundation for discussions in later chapters. The data also reveal typical limitations of eye trackers.
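To make the fixation/saccade statistics mentioned above concrete, the following minimal Python sketch applies a velocity-threshold (I-VT) pass to a gaze trace; the synthetic 100 Hz trace and the 30 deg/s threshold are illustrative assumptions, not values taken from the study.

import numpy as np

def saccade_mask(x, y, t, velocity_threshold=30.0):
    """Mark each inter-sample step as saccadic (True) or fixational (False)."""
    dt = np.diff(t)
    speed = np.hypot(np.diff(x), np.diff(y)) / dt  # deg/s if x, y are in degrees
    return speed > velocity_threshold

# Synthetic trace: one fixation, a jump, then a second fixation.
t = np.arange(0.0, 2.0, 0.01)                       # 100 Hz timestamps
x = np.where(t < 1.0, 5.0, 15.0) + np.random.normal(0, 0.1, t.size)
y = np.full(t.size, 5.0) + np.random.normal(0, 0.1, t.size)

mask = saccade_mask(x, y, t)
print(f"{np.count_nonzero(mask)} saccadic steps out of {mask.size}")

Runs of False in the mask correspond to fixations, whose extents give the fixation times; the displacement across each run of True gives a saccade length.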
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human%20computer%20interaction" title="human computer interaction">human computer interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=gaze%20tracking" title=" gaze tracking"> gaze tracking</a>, <a href="https://publications.waset.org/abstracts/search?q=calibration" title=" calibration"> calibration</a>, <a href="https://publications.waset.org/abstracts/search?q=eye%20movement" title=" eye movement"> eye movement</a> </p> <a href="https://publications.waset.org/abstracts/29656/justyna-skrzynska-zdzislaw-kobos-zbigniew-wochynski" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/29656.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">537</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13526</span> Emotions in Human-Machine Interaction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Joanna%20Maj">Joanna Maj</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Awe inspiring is the idea that emotions could be present in human-machine interactions, both on the human side as well as the machine side. Human factors present intriguing components and are examined in detail while discussing this controversial topic. Mood, attention, memory, performance, assessment, causes of emotion, and neurological responses are analyzed as components of the interaction. Problems in computer-based technology, revenge of the system on its users and design, and applications comprise a major part of all descriptions and examples throughout this paper. It also allows for critical thinking while challenging intriguing questions regarding future directions in research, dealing with emotion in human-machine interactions. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=biocomputing" title="biocomputing">biocomputing</a>, <a href="https://publications.waset.org/abstracts/search?q=biomedical%20engineering" title=" biomedical engineering"> biomedical engineering</a>, <a href="https://publications.waset.org/abstracts/search?q=emotions" title=" emotions"> emotions</a>, <a href="https://publications.waset.org/abstracts/search?q=human-machine%20interaction" title=" human-machine interaction"> human-machine interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=interfaces" title=" interfaces"> interfaces</a> </p> <a href="https://publications.waset.org/abstracts/156950/emotions-in-human-machine-interaction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/156950.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">133</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13525</span> An Interaction between Human and Animal through the Death Experience</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mindaugas%20Kazlauskas">Mindaugas Kazlauskas</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, it is presupposed that the description of the relationship between animal and human should begin with a description of the direct experience of the animal and how, in this experience, the human experiences itself (a self awareness mode). A human is concerned first and foremost with himself as a human through the experience of another as an animal. The questionsare: In the encounter with an animal, how is the animal constituted in the acts of human experience? How does human-animal interaction influence human behavioral patterns, and how does the human identifies itself in this interaction? The paper will present the results of interpretative phenomenological descriptions (IPA) of the relationship between human and animal in the face of death phenomenon through the experience of pet owners who lost their beloved companions and hunters, veterinatians, and farmers who face animal death. The results of IPA analysis reveal different relations such as the identification with an animal, the alienation experience, the experience of resistance, and an experience of detachment. Within these themes, IPA qualitative research results will be presented by highlighting patterns of human behavior, following Friedrich Schlachermacher's hermeneutics methodological principles, and reflecting on changes in value and attitude within society during daily interaction with the animal. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=animal%20human%20interaction" title="animal human interaction">animal human interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=phenomenology" title=" phenomenology"> phenomenology</a>, <a href="https://publications.waset.org/abstracts/search?q=philosophy" title=" philosophy"> philosophy</a>, <a href="https://publications.waset.org/abstracts/search?q=death%20phenomenon" title=" death phenomenon"> death phenomenon</a> </p> <a href="https://publications.waset.org/abstracts/150335/an-interaction-between-human-and-animal-through-the-death-experience" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/150335.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">151</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13524</span> Human Computer Interaction Using Computer Vision and Speech Processing</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shreyansh%20Jain%20Jeetmal">Shreyansh Jain Jeetmal</a>, <a href="https://publications.waset.org/abstracts/search?q=Shobith%20P.%20Chadaga"> Shobith P. Chadaga</a>, <a href="https://publications.waset.org/abstracts/search?q=Shreyas%20H.%20Srinivas"> Shreyas H. Srinivas</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Internet of Things (IoT) is seen as the next major step in the ongoing revolution in the Information Age. It is predicted that in the near future billions of embedded devices will be communicating with each other to perform a plethora of tasks with or without human intervention. One of the major ongoing hotbed of research activity in IoT is Human Computer Interaction (HCI). HCI is used to facilitate communication between an intelligent system and a user. An intelligent system typically comprises of a system consisting of various sensors, actuators and embedded controllers which communicate with each other to monitor data collected from the environment. Communication by the user to the system is typically done using voice. One of the major ongoing applications of HCI is in home automation as a personal assistant. The prime objective of our project is to implement a use case of HCI for home automation. Our system is designed to detect and recognize the users and personalize the appliances in the house according to their individual preferences. Our HCI system is also capable of speaking with the user when certain commands are spoken such as searching on the web for information and controlling appliances. Our system can also monitor the environment in the house such as air quality and gas leakages for added safety. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human%20computer%20interaction" title="human computer interaction">human computer interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=internet%20of%20things" title=" internet of things"> internet of things</a>, <a href="https://publications.waset.org/abstracts/search?q=computer%20vision" title=" computer vision"> computer vision</a>, <a href="https://publications.waset.org/abstracts/search?q=sensor%20networks" title=" sensor networks"> sensor networks</a>, <a href="https://publications.waset.org/abstracts/search?q=speech%20to%20text" title=" speech to text"> speech to text</a>, <a href="https://publications.waset.org/abstracts/search?q=text%20to%20speech" title=" text to speech"> text to speech</a>, <a href="https://publications.waset.org/abstracts/search?q=android" title=" android"> android</a> </p> <a href="https://publications.waset.org/abstracts/73991/human-computer-interaction-using-computer-vision-and-speech-processing" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/73991.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">362</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13523</span> Vision-Based Hand Segmentation Techniques for Human-Computer Interaction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=M.%20Jebali">M. Jebali</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Jemni"> M. Jemni</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This work is the part of vision based hand gesture recognition system for Natural Human Computer Interface. Hand tracking and segmentation are the primary steps for any hand gesture recognition system. The aim of this paper is to develop robust and efficient hand segmentation algorithm such as an input to another system which attempt to bring the HCI performance nearby the human-human interaction, by modeling an intelligent sign language recognition system based on prediction in the context of dialogue between the system (avatar) and the interlocutor. For the purpose of hand segmentation, an overcoming occlusion approach has been proposed for superior results for detection of hand from an image. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=HCI" title="HCI">HCI</a>, <a href="https://publications.waset.org/abstracts/search?q=sign%20language%20recognition" title=" sign language recognition"> sign language recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=object%20tracking" title=" object tracking"> object tracking</a>, <a href="https://publications.waset.org/abstracts/search?q=hand%20segmentation" title=" hand segmentation"> hand segmentation</a> </p> <a href="https://publications.waset.org/abstracts/26490/vision-based-hand-segmentation-techniques-for-human-computer-interaction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/26490.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">412</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13522</span> Hand Motion Tracking as a Human Computer Interation for People with Cerebral Palsy</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ana%20%20Teixeira">Ana Teixeira</a>, <a href="https://publications.waset.org/abstracts/search?q=Joao%20Orvalho"> Joao Orvalho</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper describes experiments using Scratch games, to check the feasibility of employing cerebral palsy users gestures as an alternative of interaction with a computer carried out by students of Master Human Computer Interaction (HCI) of IPC Coimbra. The main focus of this work is to study the usability of a Web Camera as a motion tracking device to achieve a virtual human-computer interaction used by individuals with CP. An approach for Human-computer Interaction (HCI) is present, where individuals with cerebral palsy react and interact with a scratch game through the use of a webcam as an external interaction device. Motion tracking interaction is an emerging technology that is becoming more useful, effective and affordable. However, it raises new questions from the HCI viewpoint, for example, which environments are most suitable for interaction by users with disabilities. In our case, we put emphasis on the accessibility and usability aspects of such interaction devices to meet the special needs of people with disabilities, and specifically people with CP. Despite the fact that our work has just started, preliminary results show that, in general, computer vision interaction systems are very useful; in some cases, these systems are the only way by which some people can interact with a computer. The purpose of the experiments was to verify two hypothesis: 1) people with cerebral palsy can interact with a computer using their natural gestures, 2) scratch games can be a research tool in experiments with disabled young people. A game in Scratch with three levels is created to be played through the use of a webcam. This device permits the detection of certain key points of the user’s body, which allows to assume the head, arms and specially the hands as the most important aspects of recognition. Tests with 5 individuals of different age and gender were made throughout 3 days through periods of 30 minutes with each participant. 
For a more extensive and reliable statistical analysis, the number of both participants and repetitions should be increased in further investigations. However, already at this stage of the research, it is possible to draw some conclusions. The first, and most important, is that simple Scratch games on the computer can be a research tool for investigating the interaction with a computer performed by young persons with CP using intentional gestures. Measurements performed with the assistance of games are attractive for young disabled users. The second important conclusion is that they are able to play Scratch games using their gestures; the proposed interaction method is therefore promising for them as a human-computer interface. In the future, we plan to develop multimodal interfaces that combine various computer vision devices with other input devices, to improve the existing systems so that they better accommodate the special needs of individuals, and to perform experiments with a larger number of participants.
Keywords: motion tracking, cerebral palsy, rehabilitation, HCI
Procedia: https://publications.waset.org/abstracts/53050/hand-motion-tracking-as-a-human-computer-interation-for-people-with-cerebral-palsy | PDF: https://publications.waset.org/abstracts/53050.pdf | Downloads: 235

13521. User Experience Measurement of User Interfaces
Authors: Mohammad Hashemi, John Herbert
Abstract: Quantifying and measuring Quality of Experience (QoE) are important and difficult concerns in Human Computer Interaction (HCI). Quality of Service (QoS) and the actual User Interface (UI) of the application are both important contributors to the QoE of a user. This paper describes a framework that accurately measures the way a user uses the UI in order to model users' behaviours and profiles. It monitors the use of the mouse and of UI elements with accurate time measurement. It does this in real time, unobtrusively and efficiently, allowing the user to work as normal with the application. This real-time, accurate measurement of the user's interaction provides valuable data and insight into the use of the UI, and is also the basis for analysis of the user's QoE.
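A minimal sketch of this kind of unobtrusive instrumentation, using only Python's standard library (tkinter is a stand-in for the authors' unspecified UI toolkit): every mouse click is timestamped with a high-resolution clock without altering the widgets' normal behaviour.

import time
import tkinter as tk

events = []  # (timestamp, widget, x, y) tuples for later QoE analysis

def log_event(event):
    events.append((time.perf_counter(), str(event.widget), event.x, event.y))

root = tk.Tk()
tk.Button(root, text="OK").pack()
tk.Entry(root).pack()
root.bind_all("<Button-1>", log_event, add="+")  # observe, don't interfere
root.mainloop()

print(events)  # dwell times and click sequences can be derived from this log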
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=user%20modelling" title="user modelling">user modelling</a>, <a href="https://publications.waset.org/abstracts/search?q=user%20interface%20experience" title=" user interface experience"> user interface experience</a>, <a href="https://publications.waset.org/abstracts/search?q=quality%20of%20experience" title=" quality of experience"> quality of experience</a>, <a href="https://publications.waset.org/abstracts/search?q=user%20experience" title=" user experience"> user experience</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20and%20computer%20interaction" title=" human and computer interaction"> human and computer interaction</a> </p> <a href="https://publications.waset.org/abstracts/3652/user-experience-measurement-of-user-interfaces" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/3652.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">503</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13520</span> Relational Attention Shift on Images Using Bu-Td Architecture and Sequential Structure Revealing</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Alona%20Faktor">Alona Faktor</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this work, we present a NN-based computational model that can perform attention shifts according to high-level instruction. The instruction specifies the type of attentional shift using explicit geometrical relation. The instruction also can be of cognitive nature, specifying more complex human-human interaction or human-object interaction, or object-object interaction. Applying this approach sequentially allows obtaining a structural description of an image. A novel data-set of interacting humans and objects is constructed using a computer graphics engine. Using this data, we perform systematic research of relational segmentation shifts. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=cognitive%20science" title="cognitive science">cognitive science</a>, <a href="https://publications.waset.org/abstracts/search?q=attentin" title=" attentin"> attentin</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=generalization" title=" generalization"> generalization</a> </p> <a href="https://publications.waset.org/abstracts/135787/relational-attention-shift-on-images-using-bu-td-architecture-and-sequential-structure-revealing" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/135787.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">198</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13519</span> Hand Controlled Mobile Robot Applied in Virtual Environment</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jozsef%20Katona">Jozsef Katona</a>, <a href="https://publications.waset.org/abstracts/search?q=Attila%20Kovari"> Attila Kovari</a>, <a href="https://publications.waset.org/abstracts/search?q=Tibor%20Ujbanyi"> Tibor Ujbanyi</a>, <a href="https://publications.waset.org/abstracts/search?q=Gergely%20Sziladi"> Gergely Sziladi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> By the development of IT systems, human-computer interaction is also developing even faster and newer communication methods become available in human-machine interaction. In this article, the application of a hand gesture controlled human-computer interface is being introduced through the example of a mobile robot. The control of the mobile robot is implemented in a realistic virtual environment that is advantageous regarding the aspect of different tests, parallel examinations, so the purchase of expensive equipment is unnecessary. The usability of the implemented hand gesture control has been evaluated by test subjects. According to the opinion of the testing subjects, the system can be well used, and its application would be recommended on other application fields too. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human-machine%20interface%20%28HCI%29" title="human-machine interface (HCI)">human-machine interface (HCI)</a>, <a href="https://publications.waset.org/abstracts/search?q=mobile%20robot" title=" mobile robot"> mobile robot</a>, <a href="https://publications.waset.org/abstracts/search?q=hand%20control" title=" hand control"> hand control</a>, <a href="https://publications.waset.org/abstracts/search?q=virtual%20environment" title=" virtual environment"> virtual environment</a> </p> <a href="https://publications.waset.org/abstracts/75711/hand-controlled-mobile-robot-applied-in-virtual-environment" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/75711.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">298</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13518</span> Mobile Cloud Computing: How to Improve</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Abdullah%20Aljumah">Abdullah Aljumah</a>, <a href="https://publications.waset.org/abstracts/search?q=Tariq%20Ahamad"> Tariq Ahamad</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The simplest possible human-computer interaction is mobile cloud computing as it emerges and makes the use of all modern-day human-oriented technology. The main aim of this idea is the QoS (quality of service) by using user-friendly and reliable software over the global network in order to make it economical by reducing cost, reliable, and increase the main storage. Since we studied and went through almost all the existing related work in this area and we came up with some challenges that will rise or might be rising for some basic areas in mobile cloud computing and mostly stogie and security area. In this research article, we suggest some recommendation for mobile cloud computing and for its security that will help in building more powerful tools to handle all this pressure. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Cloud%20Computing" title="Cloud Computing">Cloud Computing</a>, <a href="https://publications.waset.org/abstracts/search?q=MCC" title=" MCC"> MCC</a>, <a href="https://publications.waset.org/abstracts/search?q=SAAS" title=" SAAS"> SAAS</a>, <a href="https://publications.waset.org/abstracts/search?q=computer%20interaction" title=" computer interaction"> computer interaction</a> </p> <a href="https://publications.waset.org/abstracts/20811/mobile-cloud-computing-how-to-improve" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/20811.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">380</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13517</span> The Interaction between Human and Environment on the Perspective of Environmental Ethics</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mella%20Ismelina%20Farma%20Rahayu">Mella Ismelina Farma Rahayu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Environmental problems could not be separated from unethical human perspectives and behaviors toward the environment. There is a fundamental error in the philosophy of people’s perspective about human and nature and their relationship with the environment, which in turn will create an inappropriate behavior in relation to the environment. The aim of this study is to investigate and to understand the ethics of the environment in the context of humans interacting with the environment by using the hermeneutic approach. The related theories and concepts collected from literature review are used as data, which were analyzed by using interpretation, critical evaluation, internal coherence, comparisons, and heuristic techniques. As a result of this study, there will be a picture related to the interaction of human and environment in the perspective of environmental ethics, as well as the problems of the value of ecological justice in the interaction of humans and environment. We suggest that the interaction between humans and environment need to be based on environmental ethics, in a spirit of mutual respect between humans and the natural world. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=environment" title="environment">environment</a>, <a href="https://publications.waset.org/abstracts/search?q=environmental%20ethics" title=" environmental ethics"> environmental ethics</a>, <a href="https://publications.waset.org/abstracts/search?q=interaction" title=" interaction"> interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=value" title=" value"> value</a> </p> <a href="https://publications.waset.org/abstracts/45287/the-interaction-between-human-and-environment-on-the-perspective-of-environmental-ethics" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/45287.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">422</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13516</span> Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yuan-Hsiang%20Chang">Yuan-Hsiang Chang</a>, <a href="https://publications.waset.org/abstracts/search?q=Pin-Chi%20Lin"> Pin-Chi Lin</a>, <a href="https://publications.waset.org/abstracts/search?q=Li-Der%20Jeng"> Li-Der Jeng</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Advance in techniques of image and video processing has enabled the development of intelligent video surveillance systems. This study was aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. The adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during the interaction, the Kalman filter was used to retain a complete trajectory for each human object. Finally, the motion trajectory analysis was developed to distinguish between the interaction and non-interaction events based on derivatives of trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system could achieve the classification accuracy of 80% in interaction events and 95% in non-interaction events, respectively. In summary, we have explored the idea to investigate a system for the automatic classification of events for interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated in an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatch, fighting, etc.). 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=motion%20detection" title="motion detection">motion detection</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20tracking" title=" motion tracking"> motion tracking</a>, <a href="https://publications.waset.org/abstracts/search?q=trajectory%20analysis" title=" trajectory analysis"> trajectory analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=video%20surveillance" title=" video surveillance"> video surveillance</a> </p> <a href="https://publications.waset.org/abstracts/13650/automatic-motion-trajectory-analysis-for-dual-human-interaction-using-video-sequences" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/13650.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">548</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13515</span> An AI-generated Semantic Communication Platform in HCI Course</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yi%20Yang">Yi Yang</a>, <a href="https://publications.waset.org/abstracts/search?q=Jiasong%20Sun"> Jiasong Sun</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Almost every aspect of our daily lives is now intertwined with some degree of human-computer interaction (HCI). HCI courses draw on knowledge from disciplines as diverse as computer science, psychology, design principles, anthropology, and more. Our HCI courses, named the Media and Cognition course, are constantly updated to reflect state-of-the-art technological advancements such as virtual reality, augmented reality, and artificial intelligence-based interactions. For more than a decade, our course has used an interest-based approach to teaching, in which students proactively propose some research-based questions and collaborate with teachers, using course knowledge to explore potential solutions. Semantic communication plays a key role in facilitating understanding and interaction between users and computer systems, ultimately enhancing system usability and user experience. The advancements in AI-generated technology, which have gained significant attention from both academia and industry in recent years, are exemplified by language models like GPT-3 that generate human-like dialogues from given prompts. Our latest version of the Human-Computer Interaction course practices a semantic communication platform based on AI-generated techniques. The purpose of this semantic communication is twofold: to extract and transmit task-specific information while ensuring efficient end-to-end communication with minimal latency. An AI-generated semantic communication platform evaluates the retention of signal sources and converts low-retain ability visual signals into textual prompts. These data are transmitted through AI-generated techniques and reconstructed at the receiving end; on the other hand, visual signals with a high retain ability rate are compressed and transmitted according to their respective regions. The platform and associated research are a testament to our students' growing ability to independently investigate state-of-the-art technologies. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human-computer%20interaction" title="human-computer interaction">human-computer interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=media%20and%20cognition%20course" title=" media and cognition course"> media and cognition course</a>, <a href="https://publications.waset.org/abstracts/search?q=semantic%20communication" title=" semantic communication"> semantic communication</a>, <a href="https://publications.waset.org/abstracts/search?q=retainability" title=" retainability"> retainability</a>, <a href="https://publications.waset.org/abstracts/search?q=prompts" title=" prompts"> prompts</a> </p> <a href="https://publications.waset.org/abstracts/170407/an-ai-generated-semantic-communication-platform-in-hci-course" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/170407.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">115</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13514</span> Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ram%C3%B3n%20Aparicio-Garc%C3%ADa">Ramón Aparicio-García</a>, <a href="https://publications.waset.org/abstracts/search?q=Gustavo%20Ju%C3%A1rez%20Gracia"> Gustavo Juárez Gracia</a>, <a href="https://publications.waset.org/abstracts/search?q=Jes%C3%BAs%20%C3%81lvarez%20Cedillo"> Jesús Álvarez Cedillo</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A research line of the computer science that involve the study of the Human-Computer Interaction (HCI), which search to recognize and interpret the user intent by the storage and the subsequent analysis of the electrical signals of the brain, for using them in the control of electronic devices. On the other hand, the affective computing research applies the human emotions in the HCI process helping to reduce the user frustration. This paper shows the results obtained during the hardware and software development of a Brain Computer Interface (BCI) capable of recognizing the human emotions through the association of the brain electrical activity patterns. The hardware involves the sensing stage and analogical-digital conversion. The interface software involves algorithms for pre-processing of the signal in time and frequency analysis and the classification of patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal have been tested separately, by using a database that is accessible to the public, besides to a comparison among classifiers in order to know the best performing. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=affective%20computing" title="affective computing">affective computing</a>, <a href="https://publications.waset.org/abstracts/search?q=interface" title=" interface"> interface</a>, <a href="https://publications.waset.org/abstracts/search?q=brain" title=" brain"> brain</a>, <a href="https://publications.waset.org/abstracts/search?q=intelligent%20interaction" title=" intelligent interaction"> intelligent interaction</a> </p> <a href="https://publications.waset.org/abstracts/27725/brain-computer-interface-implementation-for-affective-computing-sensing-classifiers-comparison" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/27725.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">388</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13513</span> The Role of Situational Factors in User Experience during Human-Robot Interaction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Da%20Tao">Da Tao</a>, <a href="https://publications.waset.org/abstracts/search?q=Tieyan%20Wang"> Tieyan Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Mingfu%20Qin"> Mingfu Qin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> While social robots have been increasingly developed and rapidly applied in our daily life, how robots should interact with humans is still an urgent problem to be explored. Appropriate use of interactive behavior is likely to create a good user experience in human-robot interaction situations, which in turn can improve people’s acceptance of robots. This paper aimed to systematically and quantitatively examine the effects of several important situational factors (i.e., interaction distance, interaction posture, and feedback style) on user experience during human-robot interaction. A three-factor mixed designed experiment was adopted in this study, where subjects were asked to interact with a social robot in different interaction situations by combinations of varied interaction distance, interaction posture, and feedback style. A set of data on users’ behavioral performance, subjective perceptions, and eye movement measures were tracked and collected, and analyzed by repeated measures analysis of variance. The results showed that the three situational factors showed no effects on behavioral performance in tasks during human-robot interaction. Interaction distance and feedback style yielded significant main effects and interaction effects on the proportion of fixation times. The proportion of fixation times on the robot is higher for negative feedback compared with positive feedback style. While the proportion of fixation times on the robot generally decreased with the increase of the interaction distance, it decreased more under the positive feedback style than under the negative feedback style. In addition, there were significant interaction effects on pupil diameter between interaction distance and posture. As interaction distance increased, mean pupil diameter became smaller in side interaction, while it became larger in frontal interaction. 
Moreover, the three situational factors had significant interaction effects on user acceptance of the interaction mode. The findings shed light on the mechanisms underlying user experience in human-robot interaction situations and have important implications for the design of robot behavioral expression and for optimal strategies to improve user experience during human-robot interaction.
Keywords: social robots, human-robot interaction, interaction posture, interaction distance, feedback style, user experience
Procedia: https://publications.waset.org/abstracts/166976/the-role-of-situational-factors-in-user-experience-during-human-robot-interaction | PDF: https://publications.waset.org/abstracts/166976.pdf | Downloads: 132

13512. Multimodal Characterization of Emotion within Multimedia Space
Authors: Dayo Samuel Banjo, Connice Trimmingham, Niloofar Yousefi, Nitin Agarwal
Abstract: Technological advancement and its omnipresent connections have pushed humans past the boundaries and limitations of a computer screen, physical state, or geographical location. They have provided a depth of avenues that facilitate human-computer interaction that were once inconceivable, such as audio and body-language detection. Given the complex modalities of emotion, it becomes vital to study human-computer interaction, as it is the starting point for a thorough understanding of the emotional state of users and, in the context of social networks, of the producers of multimodal information. This study first examines the classification accuracy found in multimodal emotion detection systems compared to unimodal solutions. Second, it explores the characterization of multimedia content produced based on its emotions, and the coherence of emotion across different modalities, by utilizing deep learning models to classify emotion in each modality.
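A toy late-fusion sketch of the multimodal idea above, written in PyTorch: one small encoder per modality, concatenated into a shared emotion head. The feature dimensions and the four-class label set are assumptions for illustration, not the study's architecture.

import torch
import torch.nn as nn

class LateFusion(nn.Module):
    def __init__(self, text_dim=768, audio_dim=128, image_dim=512, classes=4):
        super().__init__()
        self.text = nn.Linear(text_dim, 64)    # per-modality encoders
        self.audio = nn.Linear(audio_dim, 64)
        self.image = nn.Linear(image_dim, 64)
        self.head = nn.Sequential(nn.ReLU(), nn.Linear(3 * 64, classes))

    def forward(self, t, a, i):
        fused = torch.cat([self.text(t), self.audio(a), self.image(i)], dim=-1)
        return self.head(fused)

model = LateFusion()
logits = model(torch.randn(8, 768), torch.randn(8, 128), torch.randn(8, 512))
print(logits.shape)  # torch.Size([8, 4]) -- one emotion score vector per item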
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=affective%20computing" title="affective computing">affective computing</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20recognition" title=" emotion recognition"> emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=multimodal" title=" multimodal"> multimodal</a> </p> <a href="https://publications.waset.org/abstracts/157830/multimodal-characterization-of-emotion-within-multimedia-space" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/157830.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">156</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13511</span> OPEN-EmoRec-II-A Multimodal Corpus of Human-Computer Interaction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Stefanie%20Rukavina">Stefanie Rukavina</a>, <a href="https://publications.waset.org/abstracts/search?q=Sascha%20Gruss"> Sascha Gruss</a>, <a href="https://publications.waset.org/abstracts/search?q=Steffen%20Walter"> Steffen Walter</a>, <a href="https://publications.waset.org/abstracts/search?q=Holger%20Hoffmann"> Holger Hoffmann</a>, <a href="https://publications.waset.org/abstracts/search?q=Harald%20C.%20Traue"> Harald C. Traue</a> </p> <p class="card-text"><strong>Abstract:</strong></p> OPEN-EmoRecII is an open multimodal corpus with experimentally induced emotions. In the first half of the experiment, emotions were induced with standardized picture material and in the second half during a human-computer interaction (HCI), realized with a wizard-of-oz design. The induced emotions are based on the dimensional theory of emotions (valence, arousal and dominance). These emotional sequences - recorded with multimodal data (mimic reactions, speech, audio and physiological reactions) during a naturalistic-like HCI-environment one can improve classification methods on a multimodal level. This database is the result of an HCI-experiment, for which 30 subjects in total agreed to a publication of their data including the video material for research purposes. The now available open corpus contains sensory signal of: video, audio, physiology (SCL, respiration, BVP, EMG Corrugator supercilii, EMG Zygomaticus Major) and mimic annotations. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=open%20multimodal%20emotion%20corpus" title="open multimodal emotion corpus">open multimodal emotion corpus</a>, <a href="https://publications.waset.org/abstracts/search?q=annotated%20labels" title=" annotated labels"> annotated labels</a>, <a href="https://publications.waset.org/abstracts/search?q=intelligent%20interaction" title=" intelligent interaction"> intelligent interaction</a> </p> <a href="https://publications.waset.org/abstracts/29365/open-emorec-ii-a-multimodal-corpus-of-human-computer-interaction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/29365.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">416</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13510</span> Investigating Breakdowns in Human Robot Interaction: A Conversation Analysis Guided Single Case Study of a Human-Robot Communication in a Museum Environment</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=B.%20Arend">B. Arend</a>, <a href="https://publications.waset.org/abstracts/search?q=P.%20Sunnen"> P. Sunnen</a>, <a href="https://publications.waset.org/abstracts/search?q=P.%20Caire"> P. Caire</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In a single case study, we show how a conversation analysis (CA) approach can shed light onto the sequential unfolding of human-robot interaction. Relying on video data, we are able to show that CA allows us to investigate the respective turn-taking systems of humans and a NAO robot in their dialogical dynamics, thus pointing out relevant differences. Our fine grained video analysis points out occurring breakdowns and their overcoming, when humans and a NAO-robot engage in a multimodally uttered multi-party communication during a sports guessing game. Our findings suggest that interdisciplinary work opens up the opportunity to gain new insights into the challenging issues of human robot communication in order to provide resources for developing mechanisms that enable complex human-robot interaction (HRI). 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human%20robot%20interaction" title="human robot interaction">human robot interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=conversation%20analysis" title=" conversation analysis"> conversation analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=dialogism" title=" dialogism"> dialogism</a>, <a href="https://publications.waset.org/abstracts/search?q=breakdown" title=" breakdown"> breakdown</a>, <a href="https://publications.waset.org/abstracts/search?q=museum" title=" museum"> museum</a> </p> <a href="https://publications.waset.org/abstracts/60248/investigating-breakdowns-in-human-robot-interaction-a-conversation-analysis-guided-single-case-study-of-a-human-robot-communication-in-a-museum-environment" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/60248.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">305</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13509</span> Hand Motion and Gesture Control of Laboratory Test Equipment Using the Leap Motion Controller</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ian%20A.%20Grout">Ian A. Grout</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, the design and development of a system to provide hand motion and gesture control of laboratory test equipment is considered and discussed. The Leap Motion controller is used to provide an input to control a laboratory power supply as part of an electronic circuit experiment. By suitable hand motions and gestures, control of the power supply is provided remotely and without the need to physically touch the equipment used. As such, it provides an alternative manner in which to control electronic equipment via a PC and is considered here within the field of human computer interaction (HCI). 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=control" title="control">control</a>, <a href="https://publications.waset.org/abstracts/search?q=hand%20gesture" title=" hand gesture"> hand gesture</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20computer%20interaction" title=" human computer interaction"> human computer interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=test%20equipment" title=" test equipment"> test equipment</a> </p> <a href="https://publications.waset.org/abstracts/72099/hand-motion-and-gesture-control-of-laboratory-test-equipment-using-the-leap-motion-controller" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/72099.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">315</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13508</span> The Design Process of an Interactive Seat for Improving Workplace Productivity</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Carlos%20Ferreira">Carlos Ferreira</a>, <a href="https://publications.waset.org/abstracts/search?q=Paulo%20Freitas"> Paulo Freitas</a>, <a href="https://publications.waset.org/abstracts/search?q=Valentim%20Freitas"> Valentim Freitas</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Creative industries’ workers are becoming more prominent as countries move towards intellectual-based economies. Consequently, the nature and essence of the workplace needs to be reconfigured so that creativity and productivity can be better promoted at these spaces. Using a multidisciplinary approach and a user-centered methodology, combining product design, electronic engineering, software and human-computer interaction, we have designed and developed a new seat that uses embedded sensors and actuators to increase the overall well-being of its users, their productivity and their creativity. Our contribution focuses on the parameters that most affect the user’s work on these kinds of spaces, which are, according to our study, noise and temperature. We describe the design process for a new interactive seat targeted at improving workspace productivity. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human-computer%20interaction" title="human-computer interaction">human-computer interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=usability" title=" usability"> usability</a>, <a href="https://publications.waset.org/abstracts/search?q=user%20interface" title=" user interface"> user interface</a>, <a href="https://publications.waset.org/abstracts/search?q=creativity" title=" creativity"> creativity</a>, <a href="https://publications.waset.org/abstracts/search?q=ergonomics" title=" ergonomics"> ergonomics</a> </p> <a href="https://publications.waset.org/abstracts/83453/the-design-process-of-an-interactive-seat-for-improving-workplace-productivity" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/83453.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">221</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13507</span> Stimulating Young Children Social Interaction Behaviour through Computer Play Activities: The Role of Teachers and Parents Support</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mahani%20Razali">Mahani Razali</a>, <a href="https://publications.waset.org/abstracts/search?q=Nordin%20Mamat"> Nordin Mamat</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The purpose of the study is to explore how computer technology is integrated into pre-school activities and its relationship with children’s social interaction behaviour in pre-school classroom. The major question of interest in the present study is to investigate the social interaction behaviour of children when using computers in the Malaysian pre-school classroom. This research is based on three main objectives which are to identify children`s social interaction during computer play activities, teacher’s role and parent’s participation to develop children`s social interaction. This qualitative study was carried out among 25 pre-school children, three teachers and three parents as the research sample. On the other hand, parent’s support was obtained from their discussions, supervisions and communication at home. The data collection procedures involved structured observation which was to identify social interaction behaviour among pre-school children through computer play activities; as for semi-structured interviews, it was done to study the perception of the teachers and parents on the acquired social interaction behaviour among the children. Besides, documentation analysis method was used as to triangulate acquired information with observations and interviews. In this study, the qualitative data analysis was tabulated in descriptive manner with frequency and percentage format. This study primarily focused on social interaction behaviour elements among the pre-school children. Findings revealed that the children showed positive outcomes on the social interaction behaviour during their computer play. This research summarizes that teacher’s role and parent’s support can improve children`s social interaction behaviour through computer play activities. 
As a whole, this research highlighted the significance of computer play activities in stimulating social interaction behaviour among pre-school children. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=early%20childhood" title="early childhood">early childhood</a>, <a href="https://publications.waset.org/abstracts/search?q=emotional%20development" title=" emotional development"> emotional development</a>, <a href="https://publications.waset.org/abstracts/search?q=parent%20support" title=" parent support"> parent support</a>, <a href="https://publications.waset.org/abstracts/search?q=play" title=" play"> play</a> </p> <a href="https://publications.waset.org/abstracts/53541/stimulating-young-children-social-interaction-behaviour-through-computer-play-activities-the-role-of-teachers-and-parents-support" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/53541.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">367</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13506</span> Humans as Enrichment: Human-Animal Interactions and the Perceived Benefit to the Cheetah (Acinonyx jubatus), Human and Zoological Establishment</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=S.%20J.%20Higgs">S. J. Higgs</a>, <a href="https://publications.waset.org/abstracts/search?q=E.%20Van%20Eck"> E. Van Eck</a>, <a href="https://publications.waset.org/abstracts/search?q=K.%20Heynis"> K. Heynis</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20H.%20Broadberry"> S. H. Broadberry</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Engagement with non-human animals is a rapidly-growing field of study within the animal science and social science sectors, with human-animal interactions occurring in many forms: interactions, encounters and animal-assisted therapy. To our knowledge, a wide array of research has been published on human interactions with domestic animals and livestock; however, there appear to be fewer publications relating to zoo animals and the effect these interactions have on the animal, the human and the establishment. The aim of this study was to identify whether there were any perceivable benefits from the human-animal interaction for the cheetah, the human and the establishment. Behaviour data were collected on the cheetah and the human participants before, during and after each of the nine interactions conducted, to highlight any trends. All 35 participants were asked to fill in a questionnaire prior to the interaction and immediately after, to ascertain whether their perceptions changed following an interaction with the cheetah. An online questionnaire was also distributed for three months to gain an understanding of the perceptions of human-animal interactions among members of the public, gaining 229 responses. Both questionnaires contained qualitative and quantitative questions, allowing specific, definitive answers to be analysed as well as expansion on the participants’ perceptions of human-animal interactions. 
In conclusion, it was found that participants’ perceptions of human-animal interactions changed positively, with 64% of participants altering their opinion and viewing the interaction as beneficial for the cheetah (a reduction in behaviours assumed to indicate stress) following participation in a 15-minute interaction. However, many participants felt the interaction lacked educational value, and this is therefore an area in which zoological establishments can work to improve further. The results highlighted many positive benefits for the human, the animal and the establishment; however, the study does indicate further areas for research in order to promote positive perceptions of human-animal interactions and to further increase the welfare of the animal during these interactions, with recommendations to create and regulate legislation. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Acinonyx%20jubatus" title="Acinonyx jubatus">Acinonyx jubatus</a>, <a href="https://publications.waset.org/abstracts/search?q=encounters" title=" encounters"> encounters</a>, <a href="https://publications.waset.org/abstracts/search?q=human-animal%20interactions" title=" human-animal interactions"> human-animal interactions</a>, <a href="https://publications.waset.org/abstracts/search?q=perceptions" title=" perceptions"> perceptions</a>, <a href="https://publications.waset.org/abstracts/search?q=zoological%20establishments" title=" zoological establishments"> zoological establishments</a> </p> <a href="https://publications.waset.org/abstracts/88177/humans-as-enrichment-human-animal-interactions-and-the-perceived-benefit-to-the-cheetah-acinonyx-jubatus-human-and-zoological-establishment" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/88177.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">189</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13505</span> Work in the Industry of the Future-Investigations of Human-Machine Interactions</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=S.%20Schr%C3%B6der">S. Schröder</a>, <a href="https://publications.waset.org/abstracts/search?q=P.%20Ennen"> P. Ennen</a>, <a href="https://publications.waset.org/abstracts/search?q=T.%20Langer"> T. Langer</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20M%C3%BCller"> S. Müller</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Shehadeh"> M. Shehadeh</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Haberstroh"> M. Haberstroh</a>, <a href="https://publications.waset.org/abstracts/search?q=F.%20Hees"> F. Hees</a> </p> <p class="card-text"><strong>Abstract:</strong></p> For a little over a year, Festo AG &amp; Co. KG, Festo Didactic SE, robomotion GmbH, the researchers of the Cybernetics-Lab IMA/ZLW and IfU, and the Human-Computer Interaction Center at RWTH Aachen University have been working together on assembly competences to realize different scenarios in the field of human-machine interaction (HMI). Within the framework of the ARIZ project, questions concerning the future of production in the fourth industrial revolution are addressed. 
Human-robot collaboration within Industry 4.0 can be considered from many perspectives at the individual, organizational and enterprise levels, and these will be addressed in ARIZ. The aim of the ARIZ project is to link AI approaches to assembly problems and to implement them as prototypes in demonstrators. To do so, island- and flow-based production scenarios will be simulated and realized as prototypes. These prototypes will serve as applications of flexible robotics as well as AI-based planning and control of production processes. Using the demonstrators, human interaction strategies will be examined with an information system on the one hand and a robotic system on the other. During the tests, prototypes of workspaces that illustrate prospective forms of production work will be presented. The human being will remain a central element in future production and will increasingly be in charge of managerial tasks. Questions thus arise, primarily concerning the role of humans within these technological revolutions, their ability to act and to shape such systems, and the acceptance of such systems. Roles such as the 'trainer' of intelligent systems may become a possibility in such assembly scenarios. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human-machine%20interaction" title="human-machine interaction">human-machine interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=information%20technology" title=" information technology"> information technology</a>, <a href="https://publications.waset.org/abstracts/search?q=island%20based%20production" title=" island based production"> island based production</a>, <a href="https://publications.waset.org/abstracts/search?q=assembly%20competences" title=" assembly competences"> assembly competences</a> </p> <a href="https://publications.waset.org/abstracts/86239/work-in-the-industry-of-the-future-investigations-of-human-machine-interactions" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/86239.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">205</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13504</span> A Multimodal Dialogue Management System for Achieving Natural Interaction with Embodied Conversational Agents</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ozge%20Nilay%20Yalcin">Ozge Nilay Yalcin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Dialogue has been proposed as the natural basis for human-computer interaction, being behaviorally rich and including different modalities such as gestures, posture changes, gaze, para-linguistic parameters and linguistic context. However, equipping the system with these capabilities might have consequences for the usability of the system. One issue is finding a good balance between rich behavior and fluent behavior, as planning and generating these behaviors is computationally expensive. In this work, we propose a multi-modal dialogue management system that automates the conversational flow from text-based dialogue examples and uses synchronized verbal and non-verbal conversational cues to achieve a fluent interaction. 
Our system is integrated with the SmartBody behavior realizer to provide real-time interaction with an embodied agent. The nonverbal behaviors are selected according to turn-taking behavior, the emotions and personality of the user, and a linguistic analysis of the dialogue. The verbal behaviors are responsive to the emotional value of the utterance and the feedback from the user. Our system aims at online planning of these affective multi-modal components in order to achieve an enhanced user experience with richer and more natural interaction. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=affect" title="affect">affect</a>, <a href="https://publications.waset.org/abstracts/search?q=embodied%20conversational%20agents" title=" embodied conversational agents"> embodied conversational agents</a>, <a href="https://publications.waset.org/abstracts/search?q=human-agent%20interaction" title=" human-agent interaction"> human-agent interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=multimodal%20interaction" title=" multimodal interaction"> multimodal interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=natural%20interfaces" title=" natural interfaces"> natural interfaces</a> </p> <a href="https://publications.waset.org/abstracts/91824/a-multimodal-dialogue-management-system-for-achieving-natural-interaction-with-embodied-conversational-agents" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/91824.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">175</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13503</span> Stimulating the Social Interaction Development of Children through Computer Play Activities: The Role of Teachers</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mahani%20Razali">Mahani Razali</a>, <a href="https://publications.waset.org/abstracts/search?q=Abd%20Halim%20Masnan"> Abd Halim Masnan</a>, <a href="https://publications.waset.org/abstracts/search?q=Nordin%20Mamat"> Nordin Mamat</a>, <a href="https://publications.waset.org/abstracts/search?q=Seah%20Siok%20Peh"> Seah Siok Peh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This research is based on three main objectives: to identify children’s social interaction behaviour during computer play activities, to examine the teacher’s role, and to explore teachers’ beliefs, views and knowledge about computer use in four Malaysian pre-schools. This qualitative study was carried out among 25 pre-school children and three teachers as the research sample. The data collection procedures involved structured observation, used to identify social interaction behaviour among pre-school children during computer play activities, and semi-structured interviews, conducted to study the teachers’ perceptions of the children’s social interaction behaviour development. A variety of patterns can be seen within the peer interactions, indicating that children exhibit a vast range of social interactions at the computer, varying from day to day. 
The findings of this study point to certain conclusions, with implications for understanding how computers were used and how their relationship to the children’s social interactions emerged in the four Malaysian preschools. This study provides evidence that the children’s social interactions with peers and adults were mediated by the children’s engagement in the computer environments. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=computer" title="computer">computer</a>, <a href="https://publications.waset.org/abstracts/search?q=play" title=" play"> play</a>, <a href="https://publications.waset.org/abstracts/search?q=preschool" title=" preschool"> preschool</a>, <a href="https://publications.waset.org/abstracts/search?q=social%20interaction" title=" social interaction"> social interaction</a> </p> <a href="https://publications.waset.org/abstracts/54009/stimulating-the-social-interaction-development-of-children-through-computer-play-activities-the-role-of-teachers" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/54009.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">299</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13502</span> An Interactive Platform Displaying Mixed Reality Media</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Alfred%20Chen">Alfred Chen</a>, <a href="https://publications.waset.org/abstracts/search?q=Cheng%20Chieh%20Hsu"> Cheng Chieh Hsu</a>, <a href="https://publications.waset.org/abstracts/search?q=Yu-Pin%20Ma"> Yu-Pin Ma</a>, <a href="https://publications.waset.org/abstracts/search?q=Meng-Jie%20Lin"> Meng-Jie Lin</a>, <a href="https://publications.waset.org/abstracts/search?q=Fu%20Pai%20Chiu"> Fu Pai Chiu</a>, <a href="https://publications.waset.org/abstracts/search?q=Yi-Yan%20Sie"> Yi-Yan Sie </a> </p> <p class="card-text"><strong>Abstract:</strong></p> This study attempts to construct a human-computer interactive platform system consisting mainly of an augmented hardware system, a software system, a display table, and mixed media. The system provides human-computer interaction services for the tourism industry through an interactive platform. A well-designed interactive platform, integrating augmented reality and mixed media, has the potential to enhance museum display quality and diversity. Moreover, it can create a comprehensive and creative display mode for museums and historical heritage sites. Therefore, it is essential to let the public understand what the platform is, how it functions, and, most importantly, how one builds an interactive augmented platform. Hence, the authors elaborate the construction process of the platform in detail. Three issues are considered, i.e., 1) the theory and application of augmented reality, 2) the hardware and software applied, and 3) the mixed media presented. In order to describe how the platform works, the Courtesy Door of the Tainan Confucius Temple has been selected as a case study. 
As a result, the developed interactive platform is presented, displaying a physical entity object along with virtual mixed media such as text, images, animation, and video. The platform thus delivers diversified and effective information to its users. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human-computer%20interaction" title="human-computer interaction">human-computer interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=mixed%20reality" title=" mixed reality"> mixed reality</a>, <a href="https://publications.waset.org/abstracts/search?q=mixed%20media" title=" mixed media"> mixed media</a>, <a href="https://publications.waset.org/abstracts/search?q=tourism" title=" tourism"> tourism</a> </p> <a href="https://publications.waset.org/abstracts/16872/an-interactive-platform-displaying-mixed-reality-media" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/16872.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">489</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13501</span> Evaluation of AR-4BL-MAST with Multiple Markers Interaction Technique for Augmented Reality Based Engineering Application</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Waleed%20Maqableh">Waleed Maqableh</a>, <a href="https://publications.waset.org/abstracts/search?q=Ahmad%20Al-Hamad"> Ahmad Al-Hamad</a>, <a href="https://publications.waset.org/abstracts/search?q=Manjit%20Sidhu"> Manjit Sidhu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Augmented reality (AR) technology has the capability to provide many benefits in the field of education as a modern technology that aids learning and improves the learning experience. This paper evaluates an AR-based application with a multiple-markers interaction technique (touch-to-print), designed for analyzing the kinematics of the 4BL mechanism in mechanical engineering. The application, termed AR-4BL-MAST, allows users to touch the symbols on a paper in a natural way of interaction. The evaluation of this application was performed with mechanical engineering students and human–computer interaction (HCI) experts to test its effectiveness as a tangible user interface application; the statistical results demonstrate its ability as an interaction technique, and it gives users more freedom when interacting with the virtual mechanical objects. 
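<p class="card-text">The "touch-to-print" idea of several printed markers interacting with each other can be approximated with off-the-shelf tools. The sketch below is not the paper's implementation: it uses OpenCV's ArUco module (the API of OpenCV 4.7 and later), and the 80-pixel touch threshold is an assumption for illustration. It detects every marker in a camera frame and reports pairs whose centres come close enough to count as a touch event.</p> <pre><code class="language-python">
import cv2
import numpy as np

ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
DETECTOR = cv2.aruco.ArucoDetector(ARUCO_DICT, cv2.aruco.DetectorParameters())
TOUCH_PX = 80  # assumed centre-to-centre distance that counts as a "touch"


def marker_centres(frame: np.ndarray) -> dict:
    """Return {marker_id: centre (x, y)} for every marker detected in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = DETECTOR.detectMarkers(gray)
    if ids is None:
        return {}
    return {int(i): c.reshape(4, 2).mean(axis=0)
            for i, c in zip(ids.flatten(), corners)}


def touching_pairs(centres: dict) -> list:
    """Marker-id pairs closer than TOUCH_PX: candidates for an interaction event."""
    ids = sorted(centres)
    return [(a, b) for k, a in enumerate(ids) for b in ids[k + 1:]
            if np.linalg.norm(centres[a] - centres[b]) < TOUCH_PX]
</code></pre>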
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=augmented%20reality" title="augmented reality">augmented reality</a>, <a href="https://publications.waset.org/abstracts/search?q=multimedia" title=" multimedia"> multimedia</a>, <a href="https://publications.waset.org/abstracts/search?q=user%20interface" title=" user interface"> user interface</a>, <a href="https://publications.waset.org/abstracts/search?q=engineering" title=" engineering"> engineering</a>, <a href="https://publications.waset.org/abstracts/search?q=education%20technology" title=" education technology"> education technology</a> </p> <a href="https://publications.waset.org/abstracts/52748/evaluation-of-ar-4bl-mast-with-multiple-markers-interaction-technique-for-augmented-reality-based-engineering-application" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/52748.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">575</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13500</span> Human Kinetics Education and the Computer Operations, Effects and Merits</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Kehinde%20Adeyeye%20Adelabu">Kehinde Adeyeye Adelabu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Computer applications has completely revolutionized the way of life of people which does not exclude the field of sport education. There are computer technologies which help to enhance teaching in every field of education. Invention of computers has done great to the field of education. This study was therefore carried out to examine the effects and merits of computer operations in Human Kinetics Education and Sports. The study was able to identify the component of computer, uses of computer in Human Kinetics education (sports), computer applications in some branches of human kinetics education. A qualitative research method was employed by the author in gathering experts’ views and used to analyze the effects and merits of computer applications in the field of human kinetics education. No experiment was performed in the cause of carrying out the study. The source of information for the study was text-books, journal, articles, past project reports, internet i.e. Google search engine. Computer has significantly helped to improve Education (Human Kinetic), it has complemented the basic physical fitness testing and gave a more scientific basis to the testing. The use of the software and packages has made cost projections, database applications, inventory control, management of events, word processing, electronic mailing and record keeping easier than the pasts. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=application" title="application">application</a>, <a href="https://publications.waset.org/abstracts/search?q=computer%20operation" title=" computer operation"> computer operation</a>, <a href="https://publications.waset.org/abstracts/search?q=education" title=" education"> education</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20kinetics" title=" human kinetics"> human kinetics</a> </p> <a href="https://publications.waset.org/abstracts/92823/human-kinetics-education-and-the-computer-operations-effects-and-merits" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/92823.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">186</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13499</span> Systematic Process for Constructing an Augmented Reality Display Platform</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Cheng%20Chieh%20Hsu">Cheng Chieh Hsu</a>, <a href="https://publications.waset.org/abstracts/search?q=Alfred%20Chen"> Alfred Chen</a>, <a href="https://publications.waset.org/abstracts/search?q=Yu-Pin%20Ma"> Yu-Pin Ma</a>, <a href="https://publications.waset.org/abstracts/search?q=Meng-Jie%20Lin"> Meng-Jie Lin</a>, <a href="https://publications.waset.org/abstracts/search?q=Fu%20Pai%20Chiu"> Fu Pai Chiu</a>, <a href="https://publications.waset.org/abstracts/search?q=Yi-Yan%20Sie"> Yi-Yan Sie </a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this study, it is attempted to construct an augmented reality display platform (ARDP), and its objectives are two facets, i.e. 1) providing a creative display mode for museums/historical heritages and 2) providing a benchmark for human-computer interaction professionals to build an augmented reality display platform. A general augmented reality theory has been explored in the very beginning and afterwards a systematic process model is proposed. There are three major core tasks to be done for the platform, i.e. 1) constructing the physical interactive table, 2) designing the media, and 3) designing the media carrier. In order to describe how the platform manipulates, the authors have introduced Tainan Confucius Temple, a cultural heritage in Taiwan, as a case study. As a result, a systematic process with thirteen steps has been developed and it aims at providing a rational method for constructing the platform. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human-computer%20interaction" title="human-computer interaction">human-computer interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=media" title=" media"> media</a>, <a href="https://publications.waset.org/abstracts/search?q=media%20carrier" title=" media carrier"> media carrier</a>, <a href="https://publications.waset.org/abstracts/search?q=augmented%20reality%20display%20platform" title=" augmented reality display platform"> augmented reality display platform</a> </p> <a href="https://publications.waset.org/abstracts/18782/systematic-process-for-constructing-an-augmented-reality-display-platform" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/18782.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">415</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13498</span> Design Guidelines for an Enhanced Interaction Experience in the Domain of Smartphone-Based Applications for Sport and Fitness</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Paolo%20Pilloni">Paolo Pilloni</a>, <a href="https://publications.waset.org/abstracts/search?q=Fabrizio%20Mulas"> Fabrizio Mulas</a>, <a href="https://publications.waset.org/abstracts/search?q=Salvatore%20Carta"> Salvatore Carta</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Nowadays, several research studies point up that an active lifestyle is essential for physical and mental health benefits. Mobile phones have greatly influenced people’s habits and attitudes also in the way they exercise. Our research work is mainly focused on investigating how to exploit mobile technologies to favour people’s exertion experience. To this end, we developed an exertion framework users can exploit through a real world mobile application, called BLINDED, designed to act as a virtual personal trainer to support runners during their trainings. In this work, inspired by both previous findings in the field of interaction design for people with visual impairments, feedback gathered from real users of our framework, and positive results obtained from two experimentations, we present some new interaction facilities we designed to enhance the interaction experience during a training. The positive obtained results helped us to derive some interaction design recommendations we believe will be a valid support for designers of future mobile systems conceived to be used in circumstances where there are limited possibilities of interaction. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human%20computer%20interaction" title="human computer interaction">human computer interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=interaction%20design%20guidelines" title=" interaction design guidelines"> interaction design guidelines</a>, <a href="https://publications.waset.org/abstracts/search?q=persuasive%20mobile%20technologies%20for%20sport%20and%20health" title=" persuasive mobile technologies for sport and health"> persuasive mobile technologies for sport and health</a> </p> <a href="https://publications.waset.org/abstracts/20387/design-guidelines-for-an-enhanced-interaction-experience-in-the-domain-of-smartphone-based-applications-for-sport-and-fitness" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/20387.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">532</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">‹</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=450">450</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=451">451</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=computer%20human%20interaction&page=2" rel="next">›</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" 
href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">© 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">×</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>