<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: HMI (Human Machine Interface)</title> <meta name="description" content="Search results for: HMI (Human Machine Interface)"> <meta name="keywords" content="HMI (Human Machine Interface)"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img 
src="https://cdn.waset.org/static/images/wasetc.png" alt="Open Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="HMI (Human Machine Interface)" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> 
</div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="HMI (Human Machine Interface)"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 11947</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: HMI (Human Machine Interface)</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11947</span> Evaluation of the Matching Optimization of Human-Machine Interface Matching in the Cab</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yanhua%20Ma">Yanhua Ma</a>, <a href="https://publications.waset.org/abstracts/search?q=Lu%20Zhai"> Lu Zhai</a>, <a href="https://publications.waset.org/abstracts/search?q=Xinchen%20Wang"> Xinchen Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Hongyu%20Liang"> Hongyu Liang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, by understanding the development status of the human-machine interface in today's automobile cab, a subjective and objective evaluation 
system for evaluating the optimization of human-machine interface matching in the automobile cab was established. The man-machine interface of the car cab was divided into a software interface and a hard interface. The objective evaluation method of software human factor analysis is used to evaluate the hard interface matching; the analytic hierarchy process is used to establish the evaluation index system for the software interface matching optimization, and the multi-level fuzzy comprehensive evaluation method is used to evaluate the software interface matching. This article takes the Dongfeng Sokon (DFSK) C37 automobile as an example: the evaluation method given in the paper is used to carry out the analysis and evaluation, and corresponding optimization suggestions are given, which offer a useful reference for designers. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=analytic%20hierarchy%20process" title="analytic hierarchy process">analytic hierarchy process</a>, <a href="https://publications.waset.org/abstracts/search?q=fuzzy%20comprehension%20evaluation%20method" title=" fuzzy comprehension evaluation method"> fuzzy comprehension evaluation method</a>, <a href="https://publications.waset.org/abstracts/search?q=human-machine%20interface" title=" human-machine interface"> human-machine interface</a>, <a href="https://publications.waset.org/abstracts/search?q=matching%20optimization" title=" matching optimization"> matching optimization</a>, <a href="https://publications.waset.org/abstracts/search?q=software%20human%20factor%20analysis" title=" software human factor analysis"> software human factor analysis</a> </p> <a href="https://publications.waset.org/abstracts/131104/evaluation-of-the-matching-optimization-of-human-machine-interface-matching-in-the-cab" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/131104.pdf" target="_blank" class="btn btn-primary
btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">156</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11946</span> Challenges for Interface Designers in Designing Sensor Dashboards in the Context of Industry 4.0</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Naveen%20Kumar">Naveen Kumar</a>, <a href="https://publications.waset.org/abstracts/search?q=Shyambihari%20Prajapati"> Shyambihari Prajapati</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Industry 4.0 is the fourth industrial revolution, focusing on the interconnectivity of machine to machine, human to machine, and human to human via the Internet of Things (IoT). Technologies of Industry 4.0 facilitate communication between human and machine through IoT and form the Cyber-Physical Production System (CPPS). In CPPS, sensor data from multiple shop floors are connected through IoT and displayed to the operator through a sensor dashboard. These sensor dashboards present an enormous amount of information, which makes the monitoring, controlling, and interpretation tasks complex for operators. Designing handheld sensor dashboards for supervision tasks is therefore a challenge for interface designers. This paper reports the emerging technologies of Industry 4.0, the increasing information complexity across consecutive industrial revolutions, and the upcoming design challenges for interface designers in the context of Industry 4.0. The authors conclude that the information complexity of sensor dashboard design has increased with consecutive industrial revolutions and that such designs place a cognitive load on users. Designing these complex dashboard interfaces in the Industry 4.0 context will be a main challenge for interface designers. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Industry4.0" title="Industry4.0">Industry4.0</a>, <a href="https://publications.waset.org/abstracts/search?q=sensor%20dashboard%20design" title=" sensor dashboard design"> sensor dashboard design</a>, <a href="https://publications.waset.org/abstracts/search?q=cyber-physical%20production%20system" title=" cyber-physical production system"> cyber-physical production system</a>, <a href="https://publications.waset.org/abstracts/search?q=Interface%20designer" title=" Interface designer"> Interface designer</a> </p> <a href="https://publications.waset.org/abstracts/110214/challenges-for-interface-designers-in-designing-sensor-dashboards-in-the-context-of-industry-40" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/110214.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">128</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11945</span> Human Machine Interface for Controlling a Robot Using Image Processing</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ambuj%20Kumar%20Gautam">Ambuj Kumar Gautam</a>, <a href="https://publications.waset.org/abstracts/search?q=V.%20Vasu"> V. Vasu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper introduces a head-movement-based Human Machine Interface (HMI) that uses left and right movements of the head to control robot motion. We present an effective technique for a real-time face-orientation information system to control a robot, which can be efficiently applied to an Electrical Powered Wheelchair (EPW). 
This project aims at applications related to HMI. The system identifies the orientation of the face from the pixel values in certain areas of the image. First, the whole image is divided into three parts along its columns. Depending on the orientation of the face, the maximum concentration of pixels within approximately the same range of R, G, and B values lies in one of the divided parts of the image. This information is transferred to the microcontroller through the serial communication port to control the motion of the robot (forward motion, left and right turns, and stop) in real time using head movements. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=electrical%20powered%20wheelchair%20%28EPW%29" title="electrical powered wheelchair (EPW)">electrical powered wheelchair (EPW)</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20machine%20interface%20%28HMI%29" title=" human machine interface (HMI)"> human machine interface (HMI)</a>, <a href="https://publications.waset.org/abstracts/search?q=robotics" title=" robotics"> robotics</a>, <a href="https://publications.waset.org/abstracts/search?q=microcontroller" title=" microcontroller"> microcontroller</a> </p> <a href="https://publications.waset.org/abstracts/10916/human-machine-interface-for-controlling-a-robot-using-image-processing" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/10916.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">292</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11944</span> A Study to Connect the Objective Interface Design Characters To Ergonomic Safety</h5> <div class="card-body"> <p
class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Gaoguang%20Yang">Gaoguang Yang</a>, <a href="https://publications.waset.org/abstracts/search?q=Shan%20Fu"> Shan Fu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A human-machine interface (HMI) conveys system information to human operators to facilitate their ability to manage and control the system. A well-designed HMI enhances human ability, and an evaluation must be performed to confirm that a given design enhances rather than degrades it. However, the prevalent HMI evaluation techniques have difficulty evaluating thoroughly and accurately the suitability of a given HMI, owing to the uncertainty contained in the evaluation techniques themselves and to the large number of task scenarios. The first limitation is attributable to the subjective, qualitative character of these evaluation methods, and the second to cost constraints. This study explores the connection between objective HMI characteristics and ergonomic safety, as a step toward overcoming these limitations with objective, characterized HMI parameters. A simulation experiment was performed using the time needed for human operators to recognize the HMI information as the characterized parameter, and the result showed a strong correlation between this parameter and the ergonomic safety level. 
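The abstract above does not state how the correlation between the recognition-time parameter and the ergonomic safety level was quantified. A minimal sketch, assuming a simple Pearson correlation over per-design measurements (all numbers below are hypothetical, not from the study):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two 1-D samples."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))

# Hypothetical illustration only: recognition times (s) for several HMI
# variants and an ergonomic safety score for each (higher = safer).
recognition_time = [0.8, 1.1, 1.5, 2.0, 2.6, 3.1]
safety_score     = [9.2, 8.7, 7.9, 6.8, 5.9, 5.1]

r = pearson_r(recognition_time, safety_score)
print(f"Pearson r = {r:.3f}")  # strongly negative: slower recognition, lower safety
```

A value of r near -1 on data like this is what "a strong correlation between the parameter and ergonomic safety level" would look like numerically.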
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Human-Machine%20Interface%20%28HMI%29" title="Human-Machine Interface (HMI)">Human-Machine Interface (HMI)</a>, <a href="https://publications.waset.org/abstracts/search?q=evaluation" title=" evaluation"> evaluation</a>, <a href="https://publications.waset.org/abstracts/search?q=objective" title=" objective"> objective</a>, <a href="https://publications.waset.org/abstracts/search?q=characterization" title=" characterization"> characterization</a>, <a href="https://publications.waset.org/abstracts/search?q=simulation" title=" simulation"> simulation</a> </p> <a href="https://publications.waset.org/abstracts/175438/a-study-to-connect-the-objective-interface-design-characters-to-ergonomic-safety" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/175438.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">66</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11943</span> Hand Controlled Mobile Robot Applied in Virtual Environment</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jozsef%20Katona">Jozsef Katona</a>, <a href="https://publications.waset.org/abstracts/search?q=Attila%20Kovari"> Attila Kovari</a>, <a href="https://publications.waset.org/abstracts/search?q=Tibor%20Ujbanyi"> Tibor Ujbanyi</a>, <a href="https://publications.waset.org/abstracts/search?q=Gergely%20Sziladi"> Gergely Sziladi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> By the development of IT systems, human-computer interaction is also developing even faster and newer communication methods become available in human-machine interaction. 
In this article, the application of a hand-gesture-controlled human-computer interface is introduced through the example of a mobile robot. The control of the mobile robot is implemented in a realistic virtual environment, which is advantageous for running varied tests and parallel examinations without purchasing expensive equipment. The usability of the implemented hand-gesture control was evaluated by test subjects. In their opinion, the system is easy to use, and they would recommend its application in other fields as well. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human-machine%20interface%20%28HCI%29" title="human-machine interface (HCI)">human-machine interface (HCI)</a>, <a href="https://publications.waset.org/abstracts/search?q=mobile%20robot" title=" mobile robot"> mobile robot</a>, <a href="https://publications.waset.org/abstracts/search?q=hand%20control" title=" hand control"> hand control</a>, <a href="https://publications.waset.org/abstracts/search?q=virtual%20environment" title=" virtual environment"> virtual environment</a> </p> <a href="https://publications.waset.org/abstracts/75711/hand-controlled-mobile-robot-applied-in-virtual-environment" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/75711.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">298</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11942</span> Determination of Concentrated State Using Multiple EEG Channels</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Tae%20Jin%20Choi">Tae Jin Choi</a>, <a
href="https://publications.waset.org/abstracts/search?q=Jong%20Ok%20Kim"> Jong Ok Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Sang%20Min%20Jin"> Sang Min Jin</a>, <a href="https://publications.waset.org/abstracts/search?q=Gilwon%20Yoon"> Gilwon Yoon</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Analysis of EEG brainwaves provides information on mental or emotional states. One particular state with various applications in human machine interface (HMI) is concentration. Eight-channel EEG signals were measured and analyzed, and the concentration index was compared between resting and concentrating periods. Among the eight channels, locations on the frontal lobe (Fp1 and Fp2) showed a clear increase in the concentration index during concentration regardless of the subject. The remaining six channels produced conflicting observations depending on the subject. At this time, it is not clear whether individual differences or the manner of concentrating caused these results for the remaining six channels. Nevertheless, Fp1 and Fp2 appear to be promising locations for extracting control signals for HMI applications. 
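The abstract does not define its concentration index. One common choice in the EEG literature is the ratio of beta-band power to combined alpha- and theta-band power; a minimal sketch of such an index, assuming a 256 Hz sampling rate (not stated in the abstract):

```python
import numpy as np

FS = 256  # sampling rate in Hz; an assumption, not stated in the abstract

def band_power(x, fs, lo, hi):
    """Power of signal x in the [lo, hi) Hz band via a plain periodogram."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return float(psd[mask].sum())

def concentration_index(x, fs=FS):
    """Illustrative index: beta power over (alpha + theta) power.
    The paper's own index definition is not given in the abstract."""
    theta = band_power(x, fs, 4, 8)
    alpha = band_power(x, fs, 8, 13)
    beta = band_power(x, fs, 13, 30)
    return beta / (alpha + theta)

# Synthetic check: a beta-dominated trace scores higher than an alpha-dominated one.
t = np.arange(FS * 4) / FS
alpha_wave = np.sin(2 * np.pi * 10 * t)   # 10 Hz (alpha band)
beta_wave = np.sin(2 * np.pi * 20 * t)    # 20 Hz (beta band)
resting = alpha_wave + 0.2 * beta_wave
focused = 0.2 * alpha_wave + beta_wave
print(concentration_index(focused) > concentration_index(resting))  # True
```

Applied per channel, an index like this is what would be compared between resting and concentrating periods at Fp1/Fp2 versus the other electrode sites.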
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=concentration" title="concentration">concentration</a>, <a href="https://publications.waset.org/abstracts/search?q=EEG" title=" EEG"> EEG</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20machine%20interface" title=" human machine interface"> human machine interface</a>, <a href="https://publications.waset.org/abstracts/search?q=biophysical" title=" biophysical"> biophysical</a> </p> <a href="https://publications.waset.org/abstracts/13664/determination-of-concentrated-state-using-multiple-eeg-channels" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/13664.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">482</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11941</span> Emotions in Human-Machine Interaction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Joanna%20Maj">Joanna Maj</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Awe inspiring is the idea that emotions could be present in human-machine interactions, both on the human side as well as the machine side. Human factors present intriguing components and are examined in detail while discussing this controversial topic. Mood, attention, memory, performance, assessment, causes of emotion, and neurological responses are analyzed as components of the interaction. Problems in computer-based technology, revenge of the system on its users and design, and applications comprise a major part of all descriptions and examples throughout this paper. 
The paper also invites critical thinking by posing intriguing questions about future research directions for dealing with emotion in human-machine interactions. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=biocomputing" title="biocomputing">biocomputing</a>, <a href="https://publications.waset.org/abstracts/search?q=biomedical%20engineering" title=" biomedical engineering"> biomedical engineering</a>, <a href="https://publications.waset.org/abstracts/search?q=emotions" title=" emotions"> emotions</a>, <a href="https://publications.waset.org/abstracts/search?q=human-machine%20interaction" title=" human-machine interaction"> human-machine interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=interfaces" title=" interfaces"> interfaces</a> </p> <a href="https://publications.waset.org/abstracts/156950/emotions-in-human-machine-interaction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/156950.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">133</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11940</span> An Assistive Robotic Arm for Defence and Rescue Application</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=J.%20Harrison%20Kurunathan">J. Harrison Kurunathan</a>, <a href="https://publications.waset.org/abstracts/search?q=R.%20Jayaparvathy"> R. Jayaparvathy</a> </p> <p class="card-text"><strong>Abstract:</strong></p> "Assistive robotics" is the field that studies robots that assist human motion and empower human abilities by interfacing robotic systems so that they can be manipulated by human motion. 
The proposed model is a robotic arm that works as a haptic interface based on accelerometers and DC motors and functions in response to the movement of the human muscles. It would effectively work as a haptic interface that reduces human effort in the field of defense and rescue, and it can be used in very critical conditions, such as fire accidents, to avoid casualties. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=accelerometers" title="accelerometers">accelerometers</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic%20interface" title=" haptic interface"> haptic interface</a>, <a href="https://publications.waset.org/abstracts/search?q=servo%20motors" title=" servo motors"> servo motors</a>, <a href="https://publications.waset.org/abstracts/search?q=signal%20processing" title=" signal processing"> signal processing</a> </p> <a href="https://publications.waset.org/abstracts/6771/an-assistive-robotic-arm-for-defence-and-rescue-application" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/6771.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">397</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11939</span> Achieving Shear Wave Elastography by a Three-element Probe for Wearable Human-machine Interface</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jipeng%20Yan">Jipeng Yan</a>, <a href="https://publications.waset.org/abstracts/search?q=Xingchen%20Yang"> Xingchen Yang</a>, <a href="https://publications.waset.org/abstracts/search?q=Xiaowei%20Zhou"> Xiaowei Zhou</a>, <a
href="https://publications.waset.org/abstracts/search?q=Mengxing%20Tang"> Mengxing Tang</a>, <a href="https://publications.waset.org/abstracts/search?q=Honghai%20Liu"> Honghai Liu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The shear elastic modulus of skeletal muscles can be obtained by shear wave elastography (SWE) and has been linearly related to muscle force. However, SWE is currently implemented using array probes. The price and bulk of these probes and their driving equipment prevent SWE from being used in wearable human-machine interfaces (HMIs). Moreover, the beamforming processing required by array probes reduces real-time performance. To achieve SWE with wearable HMIs, a customized three-element probe is adopted in this work, with one element for acoustic radiation force generation and the other two for shear wave tracking. In-phase quadrature demodulation and 2D autocorrelation are adopted to estimate tissue velocities along the sound beams of the latter two elements. Shear wave speeds are calculated from the phase shift between the tissue velocities. Three agar phantoms with different elasticities were made by changing the weight of agar. The shear elastic moduli of the phantoms were measured as 8.98, 23.06, and 36.74 kPa at a depth of 7.5 mm, respectively. This work verifies the feasibility of measuring the shear elastic modulus with wearable devices. 
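Under a plane-wave assumption, the shear wave speed follows from the phase shift accumulated between the two tracking beams a lateral distance dx apart, c = 2·pi·f·dx / delta_phi, and the shear modulus from mu = rho·c². A sketch with illustrative numbers (not taken from the paper):

```python
import numpy as np

RHO = 1000.0  # tissue density (kg/m^3), a standard soft-tissue assumption

def shear_modulus_from_phase(delta_phi, freq_hz, dx_m, rho=RHO):
    """Shear wave speed from the phase shift delta_phi (rad) accumulated over
    the spacing dx between the two tracking beams, then mu = rho * c^2.
    Assumes plane-wave propagation: c = 2*pi*f*dx / delta_phi."""
    c = 2.0 * np.pi * freq_hz * dx_m / delta_phi
    return rho * c * c

# Illustrative numbers only: a 200 Hz shear wave whose phase lags by 1.257 rad
# over a 3 mm beam spacing travels at ~3 m/s, giving ~9 kPa -- the range
# reported for the softest phantom in the abstract.
mu = shear_modulus_from_phase(delta_phi=1.257, freq_hz=200.0, dx_m=0.003)
print(f"shear modulus = {mu / 1000:.1f} kPa")
```

In practice the phase shift itself would come from the 2D-autocorrelation velocity estimates at the two tracking beams, which the abstract describes but does not detail.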
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=shear%20elastic%20modulus" title="shear elastic modulus">shear elastic modulus</a>, <a href="https://publications.waset.org/abstracts/search?q=skeletal%20muscle" title=" skeletal muscle"> skeletal muscle</a>, <a href="https://publications.waset.org/abstracts/search?q=ultrasound" title=" ultrasound"> ultrasound</a>, <a href="https://publications.waset.org/abstracts/search?q=wearable%20human-machine%20interface" title=" wearable human-machine interface"> wearable human-machine interface</a> </p> <a href="https://publications.waset.org/abstracts/127469/achieving-shear-wave-elastography-by-a-three-element-probe-for-wearable-human-machine-interface" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/127469.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">161</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11938</span> An Analysis of OpenSim Graphical User Interface Effectiveness</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sina%20Saadati">Sina Saadati</a> </p> <p class="card-text"><strong>Abstract:</strong></p> OpenSim is a well-known software in biomechanical studies. There are worthy algorithms developed in this program which are used for modeling and simulation of human motions. In this research, we analyze the OpenSim application from the computer science perspective. It is important that every application have a user-friendly interface. An effective user interface can decrease the time, costs, and energy needed to learn how to use a program. 
In this paper, we survey the user interface of OpenSim as an important factor of the software. Finally, we infer that there are many challenges to be addressed in the development of OpenSim. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=biomechanics" title="biomechanics">biomechanics</a>, <a href="https://publications.waset.org/abstracts/search?q=computer%20engineering" title=" computer engineering"> computer engineering</a>, <a href="https://publications.waset.org/abstracts/search?q=graphical%20user%20interface" title=" graphical user interface"> graphical user interface</a>, <a href="https://publications.waset.org/abstracts/search?q=modeling%20and%20simulation" title=" modeling and simulation"> modeling and simulation</a>, <a href="https://publications.waset.org/abstracts/search?q=interface%20effectiveness" title=" interface effectiveness"> interface effectiveness</a> </p> <a href="https://publications.waset.org/abstracts/168517/an-analysis-of-opensim-graphical-user-interface-effectiveness" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/168517.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">95</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11937</span> Delamination of Scale in a Fe Carbon Steel Surface by Effect of Interface Roughness and Oxide Scale Thickness</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=J.%20M.%20Lee">J. M. Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=W.%20R.%20Noh"> W. R. Noh</a>, <a href="https://publications.waset.org/abstracts/search?q=C.%20Y.%20Kim"> C. Y. 
Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20G.%20Lee"> M. G. Lee</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Delamination of the oxide scale has often been observed at the interface between Fe carbon steel and the oxide scale. Among the several mechanisms of this delamination behavior, the tensile stress normal to the substrate-scale interface has been described as one of the main factors. The stress distribution at the interface is also known to be affected by the thermal expansion mismatch between substrate and oxide scale, creep behavior during cooling, and the geometry of the interface. In this study, stress states near the interface in a Fe carbon steel with oxide scale have been investigated using FE simulations. The thermal and mechanical properties of the oxide scale are taken from the literature, and those of the Fe carbon steel are measured using a tensile testing machine. In particular, the normal and shear stress components developed at the interface during bending are investigated. Preliminary numerical sensitivity analyses are provided to explain the effects of the interface geometry and oxide thickness on the delamination behavior. 
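A first-order, closed-form counterpart to the FE analysis is the biaxial thermal-mismatch stress in a thin scale on a thick substrate, sigma = E_ox·(alpha_sub − alpha_ox)·dT / (1 − nu_ox). The sketch below uses illustrative property values, not those of the paper:

```python
def thermal_mismatch_stress(e_ox, nu_ox, alpha_sub, alpha_ox, d_temp):
    """First-order biaxial stress magnitude in a thin oxide scale on a thick
    substrate, produced by the thermal-expansion mismatch over a temperature
    change d_temp. Neglects creep and interface-roughness effects, which the
    FE study above accounts for."""
    return e_ox * (alpha_sub - alpha_ox) * d_temp / (1.0 - nu_ox)

# Illustrative property values (assumptions, not from the paper):
sigma = thermal_mismatch_stress(
    e_ox=200e9,        # oxide Young's modulus (Pa)
    nu_ox=0.3,         # oxide Poisson's ratio
    alpha_sub=13e-6,   # steel coefficient of thermal expansion (1/K)
    alpha_ox=11e-6,    # oxide coefficient of thermal expansion (1/K)
    d_temp=800.0,      # cooling range (K)
)
print(f"mismatch stress magnitude = {sigma / 1e6:.0f} MPa")
```

Because the steel contracts more than the scale on cooling, this in-plane stress is compressive in the scale; the normal and shear components at a rough interface, which drive delamination, are what the FE model resolves.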
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=oxide%20scale" title="oxide scale">oxide scale</a>, <a href="https://publications.waset.org/abstracts/search?q=delamination" title=" delamination"> delamination</a>, <a href="https://publications.waset.org/abstracts/search?q=Fe%20analysis" title=" FE analysis"> FE analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=roughness" title=" roughness"> roughness</a>, <a href="https://publications.waset.org/abstracts/search?q=thickness" title=" thickness"> thickness</a>, <a href="https://publications.waset.org/abstracts/search?q=stress%20state" title=" stress state"> stress state</a> </p> <a href="https://publications.waset.org/abstracts/43731/delamination-of-scale-in-a-fe-carbon-steel-surface-by-effect-of-interface-roughness-and-oxide-scale-thickness" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/43731.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">344</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11936</span> Development of Sound Tactile Interface by Use of Human Sensation of Stiffness</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=K.%20Doi">K. Doi</a>, <a href="https://publications.waset.org/abstracts/search?q=T.%20Nishimura"> T. Nishimura</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Umeda"> M. Umeda</a> </p> <p class="card-text"><strong>Abstract:</strong></p> There are very few sound interfaces that people with and without hearing impairments can use to play together. 
In this study, we developed a sound tactile interface that makes use of the human sensation of stiffness. The interface comprises eight elastic objects having varying degrees of stiffness. Each elastic object is shaped like a column. When people with and without hearing disabilities press each elastic object, different sounds are produced depending on the stiffness of the elastic object. The types of sounds used were “Do Re Mi sounds.” The interface has a major advantage in that people with or without hearing disabilities can play with it. We found that users were able to recognize the hardness sensation and relate it to the corresponding Do Re Mi sounds. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=tactile%20sense" title="tactile sense">tactile sense</a>, <a href="https://publications.waset.org/abstracts/search?q=sound%20interface" title=" sound interface"> sound interface</a>, <a href="https://publications.waset.org/abstracts/search?q=stiffness%20perception" title=" stiffness perception"> stiffness perception</a>, <a href="https://publications.waset.org/abstracts/search?q=elastic%20object" title=" elastic object"> elastic object</a> </p> <a href="https://publications.waset.org/abstracts/13051/development-of-sound-tactile-interface-by-use-of-human-sensation-of-stiffness" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/13051.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">285</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11935</span> Project Paulina: A Human-Machine Interface for Individuals with Limited Mobility and Conclusions from Research and Development</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a 
href="https://publications.waset.org/abstracts/search?q=Radoslaw%20Nagay">Radoslaw Nagay</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The Paulina Project aims to address the challenges faced by immobilized individuals, such as those with multiple sclerosis, muscular dystrophy, or spinal cord injuries, by developing a flexible hardware and software solution. This paper presents the research and development efforts of our team, which commenced in 2019 and is now in its final stage. Recognizing the diverse needs and limitations of individuals with limited mobility, we conducted in-depth testing with a group of 30 participants. The insights gained from these tests led to a complete redesign of the system. Our presentation covers the initial project ideas, observations from in-situ tests, and the newly developed system that is currently under construction. Moreover, in response to the financial constraints faced by many disabled individuals, we propose an affordable business model for the future commercialization of our invention. Through the Paulina Project, we strive to empower immobilized individuals, providing them with greater independence and improved quality of life. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=UI" title="UI">UI</a>, <a href="https://publications.waset.org/abstracts/search?q=human-machine%20interface" title=" human-machine interface"> human-machine interface</a>, <a href="https://publications.waset.org/abstracts/search?q=social%20inclusion" title=" social inclusion"> social inclusion</a>, <a href="https://publications.waset.org/abstracts/search?q=multiple%20sclerosis" title=" multiple sclerosis"> multiple sclerosis</a>, <a href="https://publications.waset.org/abstracts/search?q=muscular%20dystrophy" title=" muscular dystrophy"> muscular dystrophy</a>, <a href="https://publications.waset.org/abstracts/search?q=spinal%20cord%20injury" title=" spinal cord injury"> spinal cord injury</a>, <a href="https://publications.waset.org/abstracts/search?q=quadriplegic" title=" quadriplegic"> quadriplegic</a> </p> <a href="https://publications.waset.org/abstracts/168449/project-paulina-a-human-machine-interface-for-individuals-with-limited-mobility-and-conclusions-from-research-and-development" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/168449.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">70</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11934</span> Deleterious SNP’s Detection Using Machine Learning</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hamza%20Zidoum">Hamza Zidoum</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper investigates the impact of human genetic variation on the function of human proteins using machine-learning algorithms. 
Single-nucleotide polymorphisms represent the most common form of human genome variation. We focus on single amino-acid polymorphisms located in the coding region, as they can affect protein function and lead to pathologic phenotypic change. We use several supervised machine learning methods to identify structural properties correlated with an increased risk of a missense mutation being damaging. SVM combined with principal component analysis gives the best performance. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=single-nucleotide%20polymorphism" title="single-nucleotide polymorphism">single-nucleotide polymorphism</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=feature%20selection" title=" feature selection"> feature selection</a>, <a href="https://publications.waset.org/abstracts/search?q=SVM" title=" SVM"> SVM</a> </p> <a href="https://publications.waset.org/abstracts/45046/deleterious-snps-detection-using-machine-learning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/45046.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">377</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11933</span> Enabling Oral Communication and Accelerating Recovery: The Creation of a Novel Low-Cost Electroencephalography-Based Brain-Computer Interface for the Differently Abled</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Rishabh%20Ambavanekar">Rishabh Ambavanekar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> 
Expressive aphasia (EA) is a verbal communication disorder, common among stroke victims, in which the Broca&rsquo;s area of the brain is damaged, interfering with speech production. EA currently has no dedicated technological solution; the viable options that do exist are inefficient or available only to the affluent. This prompts the need for an affordable, innovative solution to facilitate recovery and assist in speech generation. This project proposes a novel concept: using a wearable, low-cost electroencephalography (EEG) device-based brain-computer interface (BCI) to translate a user&rsquo;s inner dialogue into words. A low-cost EEG device was developed and found to be 10 to 100 times less expensive than any current EEG device on the market. As part of the BCI, a machine learning (ML) model was developed and trained on the EEG data. Two stages of testing were conducted to analyze the effectiveness of the device: a proof-of-concept test and a final solution test. The proof-of-concept test demonstrated an average accuracy above 90%, and the final solution test demonstrated an average accuracy above 75%. These two successful tests demonstrate the viability of BCI research in developing lower-cost verbal communication devices. Additionally, the device not only enables users to communicate verbally but also has the potential to assist in accelerated recovery from the disorder. 
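The abstract does not disclose the ML model used. Purely as a hypothetical sketch of the kind of classifier such a BCI pipeline might apply to EEG features, the following minimal nearest-centroid classifier maps per-channel band-power feature vectors to imagined words; all names, labels, and numbers here are fabricated for illustration:

```python
import math

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    """Return the label whose centroid is nearest (Euclidean) to the sample."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Toy (alpha, beta, gamma) band-power vectors per imagined word -- fabricated
training = {
    "yes": [[0.9, 0.2, 0.1], [1.0, 0.3, 0.2]],
    "no":  [[0.2, 0.8, 0.7], [0.3, 0.9, 0.6]],
}
centroids = {label: centroid(vecs) for label, vecs in training.items()}
print(classify([0.95, 0.25, 0.15], centroids))  # a "yes"-like sample -> yes
```

A production BCI would replace this with a trained model over many channels and richer features, but the train-centroids/classify-sample structure is the same shape of decision the paper's ML stage performs.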
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=neurotechnology" title="neurotechnology">neurotechnology</a>, <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface" title=" brain-computer interface"> brain-computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=neuroscience" title=" neuroscience"> neuroscience</a>, <a href="https://publications.waset.org/abstracts/search?q=human-machine%20interface" title=" human-machine interface"> human-machine interface</a>, <a href="https://publications.waset.org/abstracts/search?q=BCI" title=" BCI"> BCI</a>, <a href="https://publications.waset.org/abstracts/search?q=HMI" title=" HMI"> HMI</a>, <a href="https://publications.waset.org/abstracts/search?q=aphasia" title=" aphasia"> aphasia</a>, <a href="https://publications.waset.org/abstracts/search?q=verbal%20disability" title=" verbal disability"> verbal disability</a>, <a href="https://publications.waset.org/abstracts/search?q=stroke" title=" stroke"> stroke</a>, <a href="https://publications.waset.org/abstracts/search?q=low-cost" title=" low-cost"> low-cost</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=ML" title=" ML"> ML</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20recognition" title=" image recognition"> image recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=EEG" title=" EEG"> EEG</a>, <a href="https://publications.waset.org/abstracts/search?q=signal%20analysis" title=" signal analysis"> signal analysis</a> </p> <a href="https://publications.waset.org/abstracts/149743/enabling-oral-communication-and-accelerating-recovery-the-creation-of-a-novel-low-cost-electroencephalography-based-brain-computer-interface-for-the-differently-abled" class="btn btn-primary 
btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/149743.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">119</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11932</span> Wearable Interface for Telepresence in Robotics</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Uriel%20Martinez-Hernandez">Uriel Martinez-Hernandez</a>, <a href="https://publications.waset.org/abstracts/search?q=Luke%20W.%20Boorman"> Luke W. Boorman</a>, <a href="https://publications.waset.org/abstracts/search?q=Hamideh%20Kerdegari"> Hamideh Kerdegari</a>, <a href="https://publications.waset.org/abstracts/search?q=Tony%20J.%20Prescott"> Tony J. Prescott</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we present an architecture for the study of telepresence, immersion, and human-robot interaction. The architecture is built around a wearable interface, developed here, that provides the human with visual, audio and tactile feedback from a remote location. We have chosen to interface the system with the iCub humanoid robot, as it mimics many human sensory modalities, such as vision, with gaze control and tactile feedback. This not only allows for straightforward integration of multiple sensory modalities but also offers a more complete immersion experience for the human. These systems are integrated, controlled and synchronised by an architecture developed for telepresence and human-robot interaction. Our wearable interface allows human participants to observe and explore a remote location, while also being able to communicate verbally with humans located in the remote environment. 
Our approach has been tested across local, domestic, and business venues, using wired, wireless, and Internet-based connections. This has involved implementing data compression to maintain data quality and improve the immersion experience. Initial testing has shown the wearable interface to be robust. The system will endow humans with the ability to explore and interact with other humans at remote locations using multiple sensing modalities. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=telepresence" title="telepresence">telepresence</a>, <a href="https://publications.waset.org/abstracts/search?q=telerobotics" title=" telerobotics"> telerobotics</a>, <a href="https://publications.waset.org/abstracts/search?q=human-robot%20interaction" title=" human-robot interaction"> human-robot interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=virtual%20reality" title=" virtual reality"> virtual reality</a> </p> <a href="https://publications.waset.org/abstracts/43636/wearable-interface-for-telepresence-in-robotics" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/43636.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">290</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11931</span> Political Perspectives Regarding International Laws</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hamid%20Vahidkia">Hamid Vahidkia</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper investigates the relationship between two perspectives on the nature of human rights. 
According to the “political” or “practical” perspective, human rights are claims that individuals have against certain institutional structures in particular modern states, in virtue of interests they have in contexts that include them. According to the more traditional “humanist” or “naturalistic” perspective, human rights are pre-institutional claims that individuals have against all other individuals in virtue of interests characteristic of their common humanity. This paper contends that once we see the two perspectives in their best light, we can recognize that they are complementary and that, in fact, we need both to form a sound normative understanding of the modern practice of human rights. It clarifies how humanist and political considerations can and should work in tandem to account for the concept, substance, and justification of human rights. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=politics" title="politics">politics</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20rights" title=" human rights"> human rights</a>, <a href="https://publications.waset.org/abstracts/search?q=humanities" title=" humanities"> humanities</a>, <a href="https://publications.waset.org/abstracts/search?q=mankind" title=" mankind"> mankind</a>, <a href="https://publications.waset.org/abstracts/search?q=law" title=" law"> law</a> </p> <a href="https://publications.waset.org/abstracts/183717/political-perspectives-regarding-international-laws" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/183717.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">59</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11930</span> Prototype 
Development of ARM-7 Based Embedded Controller for Packaging Machine</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jeelka%20Ray">Jeelka Ray</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A survey of the literature revealed that no practical embedded-system-based design is available for packaging machines, so the need arose to develop a prototype model. In this paper, the author describes the development of an ARM7-based embedded controller for controlling the sequence of a packaging machine. The unit is made user-friendly with a TFT touch screen implementing the human-machine interface (HMI). The different system components are briefly discussed, followed by a description of the overall design. The major functions, which include bag forming, sealing temperature control, fault detection, alarms, and an animated view on the home screen while the machine runs according to the parameters set, make the machine's performance more successful. An LPC2478 ARM7 embedded microcontroller coordinates the individual control function modules. In earlier days, these machines were manufactured with mechanical fittings; later, electronic systems replaced them. With ongoing advances in technology, these mechanical systems came to be controlled electronically using microprocessors, which became the backbone of the system, and control was eventually handed over to microcontrollers with servo drives for accurate positioning of the material. This helped maintain the quality of the products. In addition, RS-485 Modbus communication is used for synchronizing the AC drive and the servo drive. All of these functions are operated either manually or through a graphical user interface. Automatic tuning of the heaters and sealers and their temperatures is handled by proportional-integral-derivative (PID) control loops. 
Practical, user-friendly implementation of the concepts mentioned above is increasingly important in today's technological world. A real-time model was implemented and tested on the actual machine, yielding fruitful results. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=packaging%20machine" title="packaging machine">packaging machine</a>, <a href="https://publications.waset.org/abstracts/search?q=embedded%20system" title=" embedded system"> embedded system</a>, <a href="https://publications.waset.org/abstracts/search?q=ARM%207" title=" ARM 7"> ARM 7</a>, <a href="https://publications.waset.org/abstracts/search?q=micro%20controller" title=" micro controller"> micro controller</a>, <a href="https://publications.waset.org/abstracts/search?q=HMI" title=" HMI"> HMI</a>, <a href="https://publications.waset.org/abstracts/search?q=TFT" title=" TFT"> TFT</a>, <a href="https://publications.waset.org/abstracts/search?q=touch%20screen" title=" touch screen"> touch screen</a>, <a href="https://publications.waset.org/abstracts/search?q=PID" title=" PID"> PID</a> </p> <a href="https://publications.waset.org/abstracts/43478/prototype-development-of-arm-7-based-embedded-controller-for-packaging-machine" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/43478.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">275</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11929</span> Clarifier Dialogue Interface to resolve linguistic ambiguities in E-Learning Environment</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Dalila%20Souilem">Dalila Souilem</a>, 
<a href="https://publications.waset.org/abstracts/search?q=Salma%20Boumiza"> Salma Boumiza</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdelkarim%20Abdelkader"> Abdelkarim Abdelkader</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The Clarifier Dialogue Interface (CDI) is part of an online teaching system based on human-machine communication in a learning situation. The interface is used during the learning activity, specifically in the evaluation step, to clarify ambiguities in the learner's response. The CDI can generate patterns allowing access to an information system, using selectors associated with lexical units. To instantiate these patterns, the user request (especially the learner's response) must be analyzed and interpreted to deduce the canonical form, the semantic form, and the subject of the sentence. To make this interface efficient at the interpretation level, a set of substitution operators is applied in order to extend the possibilities of manipulating natural language. A second approach presented in this paper focuses on object languages, with new prospects such as combining natural language with information-system handling techniques in the area of online education. All the operators, the CDI, and the other interfaces associated with domain expertise and teaching strategies will thus be unified using the FRAME representation. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=dialogue" title="dialogue">dialogue</a>, <a href="https://publications.waset.org/abstracts/search?q=e-learning" title=" e-learning"> e-learning</a>, <a href="https://publications.waset.org/abstracts/search?q=FRAME" title=" FRAME"> FRAME</a>, <a href="https://publications.waset.org/abstracts/search?q=information%20system" title=" information system"> information system</a>, <a href="https://publications.waset.org/abstracts/search?q=natural%20language" title=" natural language"> natural language</a> </p> <a href="https://publications.waset.org/abstracts/40509/clarifier-dialogue-interface-to-resolve-linguistic-ambiguities-in-e-learning-environment" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/40509.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">377</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11928</span> A New Approach towards the Development of Next Generation CNC</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yusri%20Yusof">Yusri Yusof</a>, <a href="https://publications.waset.org/abstracts/search?q=Kamran%20Latif"> Kamran Latif</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Computer Numerical Control (CNC) machines have been widely used in industry since their inception. CNC technology is currently used for various operations such as milling, drilling, packing, and welding. With the rapid growth of the manufacturing world, the demand for flexibility in CNC machines has increased rapidly. 
Previously, commercial CNC systems failed to provide flexibility because their closed structure did not give access to the inner features of the CNC. In addition, the ISO data interface model on which CNCs operate was found to be limited. To overcome these problems, Open Architecture Control (OAC) technology and the STEP-NC data interface model were introduced. At present, the personal computer (PC) is the best platform for the development of open-CNC systems. In this paper, ISO data interface model interpretation, verification, and execution are highlighted with the introduction of new techniques. The proposed system is composed of ISO data interpretation, 3D simulation, and machine motion control modules. The system was tested on an old 3-axis CNC milling machine, and the results were found to be satisfactory in performance. This implementation has successfully enabled a sustainable manufacturing environment. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=CNC" title="CNC">CNC</a>, <a href="https://publications.waset.org/abstracts/search?q=ISO%206983" title=" ISO 6983"> ISO 6983</a>, <a href="https://publications.waset.org/abstracts/search?q=ISO%2014649" title=" ISO 14649"> ISO 14649</a>, <a href="https://publications.waset.org/abstracts/search?q=LabVIEW" title=" LabVIEW"> LabVIEW</a>, <a href="https://publications.waset.org/abstracts/search?q=open%20architecture%20control" title=" open architecture control"> open architecture control</a>, <a href="https://publications.waset.org/abstracts/search?q=reconfigurable%20manufacturing%20systems" title=" reconfigurable manufacturing systems"> reconfigurable manufacturing systems</a>, <a href="https://publications.waset.org/abstracts/search?q=sustainable%20manufacturing" title=" sustainable manufacturing"> sustainable manufacturing</a>, <a href="https://publications.waset.org/abstracts/search?q=Soft-CNC" title=" Soft-CNC"> Soft-CNC</a> </p> <a 
href="https://publications.waset.org/abstracts/28745/a-new-approach-towards-the-development-of-next-generation-cnc" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/28745.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">516</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11927</span> Gesture-Controlled Interface Using Computer Vision and Python</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Vedant%20Vardhan%20Rathour">Vedant Vardhan Rathour</a>, <a href="https://publications.waset.org/abstracts/search?q=Anant%20Agrawal"> Anant Agrawal</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The project aims to provide a touchless, intuitive interface for human-computer interaction, enabling users to control their computer using hand gestures and voice commands. The system leverages advanced computer vision techniques using the MediaPipe framework and OpenCV to detect and interpret real time hand gestures, transforming them into mouse actions such as clicking, dragging, and scrolling. Additionally, the integration of a voice assistant powered by the Speech Recognition library allows for seamless execution of tasks like web searches, location navigation and gesture control on the system through voice commands. 
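As a sketch of the gesture-to-mouse mapping such a system performs, one common approach is to treat a small thumb-tip-to-index-tip distance as a click ("pinch"). The landmark indices below follow the MediaPipe Hands convention (4 = thumb tip, 8 = index fingertip, 21 landmarks per hand, coordinates normalized to [0, 1]), but the threshold value and the overall gesture mapping are illustrative assumptions, not the authors' code:

```python
import math

THUMB_TIP, INDEX_TIP = 4, 8  # MediaPipe Hands landmark indices
PINCH_THRESHOLD = 0.05       # normalized distance; tuning value is an assumption

def is_pinch(landmarks):
    """landmarks: list of 21 (x, y) tuples in normalized image coordinates,
    e.g. extracted from results.multi_hand_landmarks in MediaPipe."""
    tx, ty = landmarks[THUMB_TIP]
    ix, iy = landmarks[INDEX_TIP]
    return math.hypot(tx - ix, ty - iy) < PINCH_THRESHOLD

# Dummy 21-point hand with thumb tip and index tip nearly touching -> click
hand = [(0.5, 0.5)] * 21
hand[THUMB_TIP] = (0.40, 0.40)
hand[INDEX_TIP] = (0.41, 0.41)
print(is_pinch(hand))  # True
```

In a full pipeline, a predicate like this would fire a mouse event (e.g. via an automation library), while cursor position would track the index fingertip frame to frame.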
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=gesture%20recognition" title="gesture recognition">gesture recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=hand%20tracking" title=" hand tracking"> hand tracking</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20networks" title=" convolutional neural networks"> convolutional neural networks</a> </p> <a href="https://publications.waset.org/abstracts/193844/gesture-controlled-interface-using-computer-vision-and-python" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/193844.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">12</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11926</span> Models Development of Graphical Human Interface Using Fuzzy Logic</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=%C3%89rick%20Arag%C3%A3o%20Ribeiro">Érick Aragão Ribeiro</a>, <a href="https://publications.waset.org/abstracts/search?q=George%20Andr%C3%A9%20Pereira%20Th%C3%A9"> George André Pereira Thé</a>, <a href="https://publications.waset.org/abstracts/search?q=Jos%C3%A9%20Marques%20Soares"> José Marques Soares</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Graphical human interfaces, also known as supervision software, are increasingly present in industrial processes supported by Supervisory Control and Data Acquisition (SCADA) systems, so the need for qualified developers is evident. 
To enable engineering students to produce high-quality supervision software, a development method must be created. In this paper, we propose a model, based on the international standards ISO/IEC 25010 and ISO/IEC 25040, for the development of graphical human interfaces. When compared with other methods through experiments, the model presented here leads to improved quality indexes and therefore helps guide programmers' decisions. Results show the efficiency of the model and its contribution to student learning. Students assessed the training they received and considered it satisfactory. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=software%20development%20models" title="software development models">software development models</a>, <a href="https://publications.waset.org/abstracts/search?q=software%20quality" title=" software quality"> software quality</a>, <a href="https://publications.waset.org/abstracts/search?q=supervision%20software" title=" supervision software"> supervision software</a>, <a href="https://publications.waset.org/abstracts/search?q=fuzzy%20logic" title=" fuzzy logic"> fuzzy logic</a> </p> <a href="https://publications.waset.org/abstracts/40132/models-development-of-graphical-human-interface-using-fuzzy-logic" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/40132.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">373</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11925</span> Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a 
href="https://publications.waset.org/abstracts/search?q=Oliver%20Ohneiser">Oliver Ohneiser</a>, <a href="https://publications.waset.org/abstracts/search?q=Francesca%20De%20Crescenzio"> Francesca De Crescenzio</a>, <a href="https://publications.waset.org/abstracts/search?q=Gianluca%20Di%20Flumeri"> Gianluca Di Flumeri</a>, <a href="https://publications.waset.org/abstracts/search?q=Jan%20Kraemer"> Jan Kraemer</a>, <a href="https://publications.waset.org/abstracts/search?q=Bruno%20Berberian"> Bruno Berberian</a>, <a href="https://publications.waset.org/abstracts/search?q=Sara%20Bagassi"> Sara Bagassi</a>, <a href="https://publications.waset.org/abstracts/search?q=Nicolina%20Sciaraffa"> Nicolina Sciaraffa</a>, <a href="https://publications.waset.org/abstracts/search?q=Pietro%20Aric%C3%B2"> Pietro Aricò</a>, <a href="https://publications.waset.org/abstracts/search?q=Gianluca%20Borghini"> Gianluca Borghini</a>, <a href="https://publications.waset.org/abstracts/search?q=Fabio%20Babiloni"> Fabio Babiloni</a> </p> <p class="card-text"><strong>Abstract:</strong></p> An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks compared to today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and less situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator as well as 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording such as electroencephalography (EEG) and eye-tracking devices. 
The current vigilance level and the attention focus of the controller are measured during the ATCO&rsquo;s active work in front of the human machine interface (HMI). The derived vigilance level and attention focus trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the VAC&rsquo;s two sub-components. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=automation" title="automation">automation</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20factors" title=" human factors"> human factors</a>, <a href="https://publications.waset.org/abstracts/search?q=air%20traffic%20controller" title=" air traffic controller"> air traffic controller</a>, <a href="https://publications.waset.org/abstracts/search?q=MINIMA" title=" MINIMA"> MINIMA</a>, <a href="https://publications.waset.org/abstracts/search?q=OOTL%20%28Out-Of-The-Loop%29" title=" OOTL (Out-Of-The-Loop)"> OOTL (Out-Of-The-Loop)</a>, <a href="https://publications.waset.org/abstracts/search?q=EEG%20%28Electroencephalography%29" title=" EEG (Electroencephalography)"> EEG (Electroencephalography)</a>, <a href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29" title=" HMI (Human Machine Interface)"> HMI (Human Machine Interface)</a> </p> <a href="https://publications.waset.org/abstracts/84169/experimental-simulation-set-up-for-validating-out-of-the-loop-mitigation-when-monitoring-high-levels-of-automation-in-air-traffic-control" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/84169.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> 
Downloads <span class="badge badge-light">383</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11924</span> The Mental Workload of ICU Nurses in Performing Human-Machine Tasks: A Cross-sectional Survey</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yan%20Yan">Yan Yan</a>, <a href="https://publications.waset.org/abstracts/search?q=Erhong%20Sun"> Erhong Sun</a>, <a href="https://publications.waset.org/abstracts/search?q=Lin%20Peng"> Lin Peng</a>, <a href="https://publications.waset.org/abstracts/search?q=Xuchun%20Ye"> Xuchun Ye</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Aims: The present study aimed to explore Intensive Care Unit(ICU) nurses’ mental workload (MWL) and associated factors with it in performing human-machine tasks. Background: A wide range of emerging technologies have penetrated widely in the field of health care, and ICU nurses are facing a dramatic increase in nursing human-machine tasks. However, there is still a paucity of literature reporting on the general MWL of ICU nurses performing human-machine tasks and the associated influencing factors. Methods: A cross-sectional survey was employed. The data was collected from January to February 2021 from 9 tertiary hospitals in 6 provinces (Shanghai, Gansu, Guangdong, Liaoning, Shandong, and Hubei). Two-stage sampling was used to recruit eligible ICU nurses (n=427). The data were collected with an electronic questionnaire comprising sociodemographic characteristics and the measures of MWL, self-efficacy, system usability, and task difficulty. The univariate analysis, two-way analysis of variance(ANOVA), and a linear mixed model were used for data analysis. Results: Overall, the mental workload of ICU nurses in performing human-machine tasks was medium (score 52.04 on a 0-100 scale). 
Among the typical nursing human-machine tasks selected, the MWL of ICU nurses in completing first aid and life support tasks (‘Using a defibrillator to defibrillate’ and ‘Use of ventilator’) was significantly higher than for other tasks (p < .001). ICU nurses’ MWL in performing human-machine tasks was also associated with age (p = .001), professional title (p = .002), years of working in ICU (p < .001), willingness to study emerging technology actively (p = .006), task difficulty (p < .001), and system usability (p < .001). Conclusion: The MWL of ICU nurses is at a moderate level in the context of a rapid increase in nursing human-machine tasks. However, there are significant differences in MWL when performing different types of human-machine tasks, and MWL can be influenced by a combination of factors. Nursing managers need to develop intervention strategies in multiple ways. Implications for practice: Multidimensional approaches are required to perform human-machine tasks better, including enhancing nurses' willingness to learn emerging technologies actively, developing training strategies that vary with tasks, and identifying obstacles in the process of human-machine system interaction. 
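The task comparison reported above boils down to testing whether mean MWL differs across task groups. As a rough, stdlib-only illustration of that kind of test (the authors' actual analysis used two-way ANOVA and a linear mixed model; the task names and scores below are invented), a one-way ANOVA F statistic can be computed from scratch:

```python
import statistics

def one_way_anova_f(groups):
    """One-way ANOVA F statistic for a list of samples (one list per group).

    A large F means the between-group variation in mean MWL dominates the
    within-group variation, i.e. the tasks plausibly differ in workload.
    """
    k = len(groups)                      # number of task groups
    n = sum(len(g) for g in groups)      # total observations
    grand_mean = statistics.fmean(x for g in groups for x in g)
    # Between-group sum of squares (group means vs. grand mean)
    ss_between = sum(len(g) * (statistics.fmean(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares (observations vs. their group mean)
    ss_within = sum(sum((x - statistics.fmean(g)) ** 2 for x in g)
                    for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Synthetic MWL scores (0-100 scale) for three hypothetical tasks
defib   = [68, 72, 75, 70]   # defibrillator use: higher workload
vent    = [66, 69, 71, 73]   # ventilator use
monitor = [45, 50, 48, 52]   # routine monitoring
f_stat = one_way_anova_f([defib, vent, monitor])
```

The p-values reported in the abstract would come from comparing this F statistic against the F distribution with (k-1, n-k) degrees of freedom.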
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=mental%20workload%28MWL%29" title="mental workload(MWL)">mental workload(MWL)</a>, <a href="https://publications.waset.org/abstracts/search?q=nurse" title=" nurse"> nurse</a>, <a href="https://publications.waset.org/abstracts/search?q=ICU" title=" ICU"> ICU</a>, <a href="https://publications.waset.org/abstracts/search?q=human-machine" title=" human-machine"> human-machine</a>, <a href="https://publications.waset.org/abstracts/search?q=tasks" title=" tasks"> tasks</a>, <a href="https://publications.waset.org/abstracts/search?q=cross-sectional%20study" title=" cross-sectional study"> cross-sectional study</a>, <a href="https://publications.waset.org/abstracts/search?q=linear%20mixed%20model" title=" linear mixed model"> linear mixed model</a>, <a href="https://publications.waset.org/abstracts/search?q=China" title=" China"> China</a> </p> <a href="https://publications.waset.org/abstracts/161738/the-mental-workload-of-icu-nurses-in-performing-human-machine-tasks-a-cross-sectional-survey" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/161738.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">104</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11923</span> Reliability Assessment and Failure Detection in a Complex Human-Machine System Using Agent-Based and Human Decision-Making Modeling</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sanjal%20Gavande">Sanjal Gavande</a>, <a href="https://publications.waset.org/abstracts/search?q=Thomas%20Mazzuchi"> Thomas Mazzuchi</a>, <a 
href="https://publications.waset.org/abstracts/search?q=Shahram%20Sarkani"> Shahram Sarkani</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In a complex aerospace operational environment, identifying failures in a procedure involving multiple human-machine interactions are difficult. These failures could lead to accidents causing loss of hardware or human life. The likelihood of failure further increases if operational procedures are tested for a novel system with multiple human-machine interfaces and with no prior performance data. The existing approach in the literature of reviewing complex operational tasks in a flowchart or tabular form doesn’t provide any insight into potential system failures due to human decision-making ability. To address these challenges, this research explores an agent-based simulation approach for reliability assessment and fault detection in complex human-machine systems while utilizing a human decision-making model. The simulation will predict the emergent behavior of the system due to the interaction between humans and their decision-making capability with the varying states of the machine and vice-versa. Overall system reliability will be evaluated based on a defined set of success-criteria conditions and the number of recorded failures over an assigned limit of Monte Carlo runs. The study also aims at identifying high-likelihood failure locations for the system. The research concludes that system reliability and failures can be effectively calculated when individual human and machine agent states are clearly defined. This research is limited to the operations phase of a system lifecycle process in an aerospace environment only. Further exploration of the proposed agent-based and human decision-making model will be required to allow for a greater understanding of this topic for application outside of the operations domain. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=agent-based%20model" title="agent-based model">agent-based model</a>, <a href="https://publications.waset.org/abstracts/search?q=complex%20human-machine%20system" title=" complex human-machine system"> complex human-machine system</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20decision-making%20model" title=" human decision-making model"> human decision-making model</a>, <a href="https://publications.waset.org/abstracts/search?q=system%20reliability%20assessment" title=" system reliability assessment"> system reliability assessment</a> </p> <a href="https://publications.waset.org/abstracts/167003/reliability-assessment-and-failure-detection-in-a-complex-human-machine-system-using-agent-based-and-human-decision-making-modeling" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/167003.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">168</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11922</span> Noninvasive Brain-Machine Interface to Control Both Mecha TE Robotic Hands Using Emotiv EEG Neuroheadset</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Adrienne%20Kline">Adrienne Kline</a>, <a href="https://publications.waset.org/abstracts/search?q=Jaydip%20Desai"> Jaydip Desai</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Electroencephalogram (EEG) is a noninvasive technique that registers signals originating from the firing of neurons in the brain. 
The Emotiv EEG Neuroheadset is a consumer product comprising 14 EEG channels and was used to record the reactions of the neurons within the brain to two forms of stimuli in 10 participants. These stimuli consisted of auditory and visual formats that provided directions of ‘right’ or ‘left.’ Participants were instructed to raise their right or left arm in accordance with the instruction given. A scenario in OpenViBE was generated to both stimulate the participants and record their data. In OpenViBE, the Graz Motor BCI Stimulator algorithm was configured to govern the duration and number of visual stimuli. Utilizing EEGLAB under cross-platform MATLAB®, the electrodes most stimulated during the study were identified. Data outputs from EEGLAB were analyzed using IBM SPSS Statistics® Version 20. This aided in determining the electrodes to use in the development of a brain-machine interface (BMI) using real-time EEG signals from the Emotiv EEG Neuroheadset. Signal processing and feature extraction were accomplished via the Simulink® signal processing toolbox. An Arduino™ Duemilanove microcontroller was used to link the Emotiv EEG Neuroheadset and the right and left Mecha TE™ Hands. 
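The left/right decision chain described above can be caricatured in a few lines. The sketch below is not the paper's Simulink pipeline: the hemisphere channel grouping and the mean-squared-amplitude "power" feature are simplifying assumptions made for illustration only.

```python
import statistics

# Hypothetical mapping of the 14 Emotiv channel indices to hemispheres
LEFT_CHANNELS = [0, 1, 2, 3, 4, 5, 6]
RIGHT_CHANNELS = [7, 8, 9, 10, 11, 12, 13]

def band_power(samples):
    """Mean squared amplitude of one channel window (a crude power feature)."""
    return statistics.fmean(x * x for x in samples)

def classify_window(window):
    """window: list of 14 per-channel sample lists. Returns 'left' or 'right'
    by comparing average power across the two hemisphere channel groups,
    a deliberately simplified stand-in for real feature extraction."""
    left = statistics.fmean(band_power(window[i]) for i in LEFT_CHANNELS)
    right = statistics.fmean(band_power(window[i]) for i in RIGHT_CHANNELS)
    return 'left' if left > right else 'right'
```

In a real BMI the classifier's output string would be sent (e.g. over serial) to the microcontroller driving the robotic hands.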
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain-machine%20interface" title="brain-machine interface">brain-machine interface</a>, <a href="https://publications.waset.org/abstracts/search?q=EEGLAB" title=" EEGLAB"> EEGLAB</a>, <a href="https://publications.waset.org/abstracts/search?q=emotiv%20EEG%20neuroheadset" title=" emotiv EEG neuroheadset"> emotiv EEG neuroheadset</a>, <a href="https://publications.waset.org/abstracts/search?q=OpenViBE" title=" OpenViBE"> OpenViBE</a>, <a href="https://publications.waset.org/abstracts/search?q=simulink" title=" simulink"> simulink</a> </p> <a href="https://publications.waset.org/abstracts/28333/noninvasive-brain-machine-interface-to-control-both-mecha-te-robotic-hands-using-emotiv-eeg-neuroheadset" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/28333.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">502</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11921</span> Parameters Influencing Human Machine Interaction in Hospitals</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hind%20Bouami">Hind Bouami</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Handling life-critical systems complexity requires to be equipped with appropriate technology and the right human agents’ functions such as knowledge, experience, and competence in problem’s prevention and solving. Human agents are involved in the management and control of human-machine system’s performance. Documenting human agent’s situation awareness is crucial to support human-machine designers’ decision-making. 
Knowledge about risks, critical parameters, and factors that can impact and threaten an automation system’s performance should be collected using preventive and retrospective approaches. This paper aims to document operators’ situation awareness through the analysis of automated organizations’ feedback. The analysis of feedback from automated hospital pharmacies helps to identify and control critical parameters influencing human-machine interaction in order to enhance the system’s performance and security. Our human-machine system evaluation approach has been deployed in Macon hospital center’s pharmacy, which has been equipped with automated drug dispensing systems since 2015. The automation specifications relate to technical aspects, human-machine interaction, and human aspects. The evaluation of drug delivery automation performance in Macon hospital center has shown that the performance of the automated activity depends on the performance of the automated solution chosen, and also on the control of systemic factors. In fact, 80.95% of the automation specifications related to the chosen Sinteco automated solution are met. The performance of the chosen automated solution is involved in 28.38% of the automation specifications' performance in Macon hospital center. The remaining systemic parameters involved in automation specifications performance need to be controlled. 
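The 80.95% figure above is, in effect, a compliance ratio over a specification checklist. A minimal sketch follows; the 21-item checklist size and the item names are assumptions, chosen only so that 17 met items reproduce the reported 80.95%:

```python
def specification_compliance(spec_results):
    """Percentage of automation specification items that are met.

    spec_results: dict mapping a specification item name to True/False.
    """
    met = sum(1 for ok in spec_results.values() if ok)
    return round(100 * met / len(spec_results), 2)

# Hypothetical 21-item checklist with 17 items met: 17/21 = 80.95%
checklist = {f"spec-{i:02d}": i < 17 for i in range(21)}
compliance = specification_compliance(checklist)
```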
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=life-critical%20systems" title="life-critical systems">life-critical systems</a>, <a href="https://publications.waset.org/abstracts/search?q=situation%20awareness" title=" situation awareness"> situation awareness</a>, <a href="https://publications.waset.org/abstracts/search?q=human-machine%20interaction" title=" human-machine interaction"> human-machine interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=decision-making" title=" decision-making"> decision-making</a> </p> <a href="https://publications.waset.org/abstracts/139410/parameters-influencing-human-machine-interaction-in-hospitals" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/139410.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">181</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11920</span> The Mental Workload of Intensive Care Unit Nurses in Performing Human-Machine Tasks: A Cross-Sectional Survey</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yan%20Yan">Yan Yan</a>, <a href="https://publications.waset.org/abstracts/search?q=Erhong%20Sun"> Erhong Sun</a>, <a href="https://publications.waset.org/abstracts/search?q=Lin%20Peng"> Lin Peng</a>, <a href="https://publications.waset.org/abstracts/search?q=Xuchun%20Ye"> Xuchun Ye</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Aims: The present study aimed to explore Intensive Care Unit (ICU) nurses’ mental workload (MWL) and associated factors with it in performing human-machine tasks. 
Background: A wide range of emerging technologies have penetrated the field of health care, and ICU nurses are facing a dramatic increase in nursing human-machine tasks. However, there is still a paucity of literature reporting on the general MWL of ICU nurses performing human-machine tasks and the associated influencing factors. Methods: A cross-sectional survey was employed. The data were collected from January to February 2021 from 9 tertiary hospitals in 6 provinces (Shanghai, Gansu, Guangdong, Liaoning, Shandong, and Hubei). Two-stage sampling was used to recruit eligible ICU nurses (n=427). The data were collected with an electronic questionnaire comprising sociodemographic characteristics and the measures of MWL, self-efficacy, system usability, and task difficulty. The univariate analysis, two-way analysis of variance (ANOVA), and a linear mixed model were used for data analysis. Results: Overall, the mental workload of ICU nurses in performing human-machine tasks was medium (score 52.04 on a 0-100 scale). Among the typical nursing human-machine tasks selected, the MWL of ICU nurses in completing first aid and life support tasks (‘Using a defibrillator to defibrillate’ and ‘Use of ventilator’) was significantly higher than for other tasks (p < .001). ICU nurses’ MWL in performing human-machine tasks was also associated with age (p = .001), professional title (p = .002), years of working in ICU (p < .001), willingness to study emerging technology actively (p = .006), task difficulty (p < .001), and system usability (p < .001). Conclusion: The MWL of ICU nurses is at a moderate level in the context of a rapid increase in nursing human-machine tasks. However, there are significant differences in MWL when performing different types of human-machine tasks, and MWL can be influenced by a combination of factors. Nursing managers need to develop intervention strategies in multiple ways. 
Implications for practice: Multidimensional approaches are required to perform human-machine tasks better, including enhancing nurses' willingness to learn emerging technologies actively, developing training strategies that vary with tasks, and identifying obstacles in the process of human-machine system interaction. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=mental%20workload" title="mental workload">mental workload</a>, <a href="https://publications.waset.org/abstracts/search?q=nurse" title=" nurse"> nurse</a>, <a href="https://publications.waset.org/abstracts/search?q=ICU" title=" ICU"> ICU</a>, <a href="https://publications.waset.org/abstracts/search?q=human-machine" title=" human-machine"> human-machine</a>, <a href="https://publications.waset.org/abstracts/search?q=tasks" title=" tasks"> tasks</a>, <a href="https://publications.waset.org/abstracts/search?q=cross-sectional%20study" title=" cross-sectional study"> cross-sectional study</a>, <a href="https://publications.waset.org/abstracts/search?q=linear%20mixed%20model" title=" linear mixed model"> linear mixed model</a>, <a href="https://publications.waset.org/abstracts/search?q=China" title=" China"> China</a> </p> <a href="https://publications.waset.org/abstracts/160977/the-mental-workload-of-intensive-care-unit-nurses-in-performing-human-machine-tasks-a-cross-sectional-survey" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/160977.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">69</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11919</span> Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison</h5> <div class="card-body"> <p 
class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ram%C3%B3n%20Aparicio-Garc%C3%ADa">Ramón Aparicio-García</a>, <a href="https://publications.waset.org/abstracts/search?q=Gustavo%20Ju%C3%A1rez%20Gracia"> Gustavo Juárez Gracia</a>, <a href="https://publications.waset.org/abstracts/search?q=Jes%C3%BAs%20%C3%81lvarez%20Cedillo"> Jesús Álvarez Cedillo</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A research line of the computer science that involve the study of the Human-Computer Interaction (HCI), which search to recognize and interpret the user intent by the storage and the subsequent analysis of the electrical signals of the brain, for using them in the control of electronic devices. On the other hand, the affective computing research applies the human emotions in the HCI process helping to reduce the user frustration. This paper shows the results obtained during the hardware and software development of a Brain Computer Interface (BCI) capable of recognizing the human emotions through the association of the brain electrical activity patterns. The hardware involves the sensing stage and analogical-digital conversion. The interface software involves algorithms for pre-processing of the signal in time and frequency analysis and the classification of patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal have been tested separately, by using a database that is accessible to the public, besides to a comparison among classifiers in order to know the best performing. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=affective%20computing" title="affective computing">affective computing</a>, <a href="https://publications.waset.org/abstracts/search?q=interface" title=" interface"> interface</a>, <a href="https://publications.waset.org/abstracts/search?q=brain" title=" brain"> brain</a>, <a href="https://publications.waset.org/abstracts/search?q=intelligent%20interaction" title=" intelligent interaction"> intelligent interaction</a> </p> <a href="https://publications.waset.org/abstracts/27725/brain-computer-interface-implementation-for-affective-computing-sensing-classifiers-comparison" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/27725.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">388</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11918</span> A Holographic Infotainment System for Connected and Driverless Cars: An Exploratory Study of Gesture Based Interaction </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Nicholas%20Lambert">Nicholas Lambert</a>, <a href="https://publications.waset.org/abstracts/search?q=Seungyeon%20Ryu"> Seungyeon Ryu</a>, <a href="https://publications.waset.org/abstracts/search?q=Mehmet%20Mulla"> Mehmet Mulla</a>, <a href="https://publications.waset.org/abstracts/search?q=Albert%20Kim"> Albert Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, an interactive in-car interface called HoloDash is presented. 
It is intended to provide information and infotainment in both autonomous vehicles and &lsquo;connected cars&rsquo;, vehicles equipped with Internet access via cellular services. The research focuses on the development of interactive avatars for this system and its gesture-based control system. This is a case study for the development of a possible human-centred means of presenting a connected or autonomous vehicle&rsquo;s On-Board Diagnostics through a projected &lsquo;holographic&rsquo; infotainment system. This system is termed a Holographic Human Vehicle Interface (HHIV), as it utilises a dashboard projection unit and gesture detection. The research also examines the suitability of gestures in an automotive environment, given that the interface might be used in both driver-controlled and driverless vehicles. Using Human Centred Design methods, questions were posed to test subjects and preferences were discovered in terms of the gesture interface and the user experience for passengers within the vehicle. These affirm the benefits of this mode of visual communication for both connected and driverless cars. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=gesture" title="gesture">gesture</a>, <a href="https://publications.waset.org/abstracts/search?q=holographic%20interface" title=" holographic interface"> holographic interface</a>, <a href="https://publications.waset.org/abstracts/search?q=human-computer%20interaction" title=" human-computer interaction"> human-computer interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=user-centered%20design" title=" user-centered design"> user-centered design</a> </p> <a href="https://publications.waset.org/abstracts/87789/a-holographic-infotainment-system-for-connected-and-driverless-cars-an-exploratory-study-of-gesture-based-interaction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/87789.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">312</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">&lsaquo;</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=5">5</a></li> <li class="page-item"><a class="page-link" 
href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=398">398</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=399">399</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=HMI%20%28Human%20Machine%20Interface%29&amp;page=2" rel="next">&rsaquo;</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a 
href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr 
style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>
