<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: haptic</title> <meta name="description" content="Search results for: haptic"> <meta name="keywords" content="haptic"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" alt="Open Science Research 
Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="haptic" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> 
<form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="haptic"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 37</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: haptic</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">37</span> Effects of Non-Diagnostic Haptic Information on Consumers&#039; Product Judgments and Decisions</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Eun%20Young%20Park">Eun Young Park</a>, <a href="https://publications.waset.org/abstracts/search?q=Jongwon%20Park"> Jongwon Park</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A physical touch of a product can provide ample diagnostic information about the product attributes and quality. However, consumers’ product judgments and purchases can be erroneously influenced by non-diagnostic haptic information. For example, consumers’ evaluations of the coffee they drink could be affected by the heaviness of a cup that is used for just serving the coffee. This important issue has received little attention in prior research. 
The present research contributes to the literature by identifying when and how non-diagnostic haptic information can have an influence and why such influence occurs. Specifically, five studies experimentally varied the content of non-diagnostic haptic information, such as the weight of a cup (heavy vs. light) and the texture of a cup holder (smooth vs. rough), and then assessed the impact of the manipulation on product judgments and decisions. Results show that non-diagnostic haptic information has a biasing impact on consumer judgments. For example, the heavy (vs. light) cup increases consumers’ perception of the richness of coffee in it, and the rough (vs. smooth) texture of a cup holder increases the perception of the healthfulness of fruit juice in it, which in turn increases consumers’ purchase intentions for the product. When consumers are cognitively distracted during the touch experience, the impact of the content of haptic information is no longer evident, but the valence (positive vs. negative) of the haptic experience influences product judgments. However, consumers are able to avoid the impact of non-diagnostic haptic information, if and only if they are both knowledgeable about the product category and undistracted from processing the touch experience. In sum, the nature of the influence by non-diagnostic haptic information (i.e., assimilation effect vs. contrast effect vs. null effect) is determined by the content and valence of haptic information, the relative impact of which depends on whether consumers can identify the content and source of the haptic information. Theoretically, to the best of our knowledge, this research is the first to document empirical evidence of the interplay between cognitive and affective processes that determines the impact of non-diagnostic haptic information. Managerial implications are discussed. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=consumer%20behavior" title="consumer behavior">consumer behavior</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic%20information" title=" haptic information"> haptic information</a>, <a href="https://publications.waset.org/abstracts/search?q=product%20judgments" title=" product judgments"> product judgments</a>, <a href="https://publications.waset.org/abstracts/search?q=touch%20effect" title=" touch effect"> touch effect</a> </p> <a href="https://publications.waset.org/abstracts/94100/effects-of-non-diagnostic-haptic-information-on-consumers-product-judgments-and-decisions" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/94100.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">174</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">36</span> Haptic Cycle: Designing Enhanced Museum Learning Activities</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Menelaos%20N.%20Katsantonis">Menelaos N. 
Katsantonis</a>, <a href="https://publications.waset.org/abstracts/search?q=Athanasios%20Manikas"> Athanasios Manikas</a>, <a href="https://publications.waset.org/abstracts/search?q=Alexandros%20Chatzis"> Alexandros Chatzis</a>, <a href="https://publications.waset.org/abstracts/search?q=Stavros%20Doropoulos"> Stavros Doropoulos</a>, <a href="https://publications.waset.org/abstracts/search?q=Anastasios%20Avramis"> Anastasios Avramis</a>, <a href="https://publications.waset.org/abstracts/search?q=Ioannis%20Mavridis"> Ioannis Mavridis</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Museums enhance their potential by adopting new technologies and techniques to appeal to more visitors and engage them in creative and joyful activities. In this study, the Haptic Cycle is presented, a cycle of museum activities proposed for the development of museum learning approaches with optimized effectiveness and engagement. Haptic Cycle envisages the improvement of the museum’s services by offering a wide range of activities. Haptic Cycle activities make the museum’s exhibitions more approachable by bringing them closer to the visitors. Visitors can interact with the museum’s artifacts and explore them haptically and sonically. Haptic Cycle proposes constructivist learning activities in which visitors actively construct their knowledge by exploring the artifacts, experimenting with them and realizing their importance. Based on the Haptic Cycle, we developed the HapticSOUND system, an innovative virtual reality system that includes an advanced user interface that employs gesture-based technology. HapticSOUND’s interface utilizes the Leap Motion gesture-recognition controller and a 3D-printed traditional Cretan lute, which visitors use to perform various activities such as exploring the lute and playing notes and songs. 
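The interaction loop described above, a gesture recognizer reporting fingertip touches that are turned into notes on a virtual lute, can be sketched minimally as follows. The open-string pitches and function names are illustrative assumptions, not details taken from the HapticSOUND paper:

```python
# Hypothetical note mapping for a virtual stringed instrument.
# The open-string MIDI pitches below are illustrative only, not
# the actual tuning of the 3D-printed Cretan lute in HapticSOUND.
OPEN_STRINGS = [55, 62, 69, 74]  # one MIDI pitch per course

def note_for(string: int, fret: int) -> int:
    """Return the MIDI note for a fingertip detected on `string`
    at `fret`: each fret raises the open string by one semitone."""
    if not 0 <= string < len(OPEN_STRINGS):
        raise ValueError("no such string")
    return OPEN_STRINGS[string] + fret

def trigger(events):
    """Map a stream of (string, fret) touch events, as a gesture
    recognizer might emit them, to the notes to be sounded."""
    return [note_for(s, f) for s, f in events]
```

In a real system the `events` stream would come from the gesture-recognition controller's hand-tracking data rather than a list.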
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=haptic%20cycle" title="haptic cycle">haptic cycle</a>, <a href="https://publications.waset.org/abstracts/search?q=HapticSOUND" title=" HapticSOUND"> HapticSOUND</a>, <a href="https://publications.waset.org/abstracts/search?q=museum%20learning" title=" museum learning"> museum learning</a>, <a href="https://publications.waset.org/abstracts/search?q=gesture-based" title=" gesture-based"> gesture-based</a>, <a href="https://publications.waset.org/abstracts/search?q=leap%20motion" title=" leap motion"> leap motion</a> </p> <a href="https://publications.waset.org/abstracts/165300/haptic-cycle-designing-enhanced-museum-learning-activities" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/165300.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">91</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">35</span> An Assistive Robotic Arm for Defence and Rescue Application</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=J.%20Harrison%20Kurunathan">J. Harrison Kurunathan</a>, <a href="https://publications.waset.org/abstracts/search?q=R.%20Jayaparvathy"> R. Jayaparvathy</a> </p> <p class="card-text"><strong>Abstract:</strong></p> "Assistive Robotics" is the field that deals with the study of robots that help in human motion and also empower human abilities by interfacing robotic systems that can be manipulated by human motion. The proposed model is a robotic arm that works as a haptic interface on the basis of accelerometers and DC motors that will function with respect to the movement of the human muscle. 
The proposed model would effectively work as a haptic interface that reduces human effort in the fields of defense and rescue. It can be used in highly critical conditions, such as fire accidents, to avoid casualties. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=accelerometers" title="accelerometers">accelerometers</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic%20interface" title=" haptic interface"> haptic interface</a>, <a href="https://publications.waset.org/abstracts/search?q=servo%20motors" title=" servo motors"> servo motors</a>, <a href="https://publications.waset.org/abstracts/search?q=signal%20processing" title=" signal processing"> signal processing</a> </p> <a href="https://publications.waset.org/abstracts/6771/an-assistive-robotic-arm-for-defence-and-rescue-application" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/6771.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">397</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">34</span> Learning Gains and Constraints Resulting from Haptic Sensory Feedback among Preschoolers&#039; Engagement during Science Experimentation </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Marios%20Papaevripidou">Marios Papaevripidou</a>, <a href="https://publications.waset.org/abstracts/search?q=Yvoni%20Pavlou"> Yvoni Pavlou</a>, <a href="https://publications.waset.org/abstracts/search?q=Zacharias%20Zacharia"> Zacharias Zacharia</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Embodied cognition and additional (touch) sensory channel theories indicate that physical manipulation is 
crucial to learning since it provides, among other things, touch sensory input, which is needed for constructing knowledge. Given these theories, the use of Physical Manipulatives (PM) becomes a prerequisite for learning. On the other hand, empirical research on learning with Virtual Manipulatives (VM) (e.g., simulations) has provided evidence showing that the use of PM, and thus haptic sensory input, is not always a prerequisite for learning. In order to investigate which means of experimentation, PM or VM, are required for enhancing student science learning at the kindergarten level, an empirical study was conducted that sought to investigate the impact of haptic feedback on the conceptual understanding of pre-school students (n = 44, mean age = 5.7) in three science domains: beam balance (D1), sinking/floating (D2) and springs (D3). The participants were equally divided into two groups according to the type of manipulatives used (PM: presence of haptic feedback, VM: absence of haptic feedback) during a semi-structured interview for each of the domains. All interviews followed the Predict-Observe-Explain (POE) strategy and consisted of three phases: initial evaluation, experimentation, final evaluation. The data collected through the interviews were analyzed qualitatively (open-coding for identifying students’ ideas in each domain) and quantitatively (use of non-parametric tests). Findings revealed that the haptic feedback enabled students to distinguish heavier from lighter objects when holding them during experimentation. In D1 the haptic feedback did not differentiate PM and VM students' conceptual understanding of the function of the beam as a means to compare the mass of objects. In D2 the haptic feedback appeared to have a negative impact on PM students’ learning. 
Feeling the weight of an object strengthened PM students’ misconception that heavier objects always sink, whereas endorsement of the scientifically correct idea that the material of an object determines its sinking/floating behavior in water was significantly higher among the VM students than the PM ones. In D3 the PM students significantly outperformed the VM students with regard to the idea that the heavier an object is, the more the spring will expand, indicating that the haptic input experienced by the PM students served as an advantage to their learning. These findings point to the fact that PMs, and thus touch sensory input, might not always be a requirement for science learning and that VMs could be considered, under certain circumstances, as a viable means for experimentation. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=haptic%20feedback" title="haptic feedback">haptic feedback</a>, <a href="https://publications.waset.org/abstracts/search?q=physical%20and%20virtual%20manipulatives" title=" physical and virtual manipulatives"> physical and virtual manipulatives</a>, <a href="https://publications.waset.org/abstracts/search?q=pre-school%20science%20learning" title=" pre-school science learning"> pre-school science learning</a>, <a href="https://publications.waset.org/abstracts/search?q=science%20experimentation" title=" science experimentation"> science experimentation</a> </p> <a href="https://publications.waset.org/abstracts/119896/learning-gains-and-constraints-resulting-from-haptic-sensory-feedback-among-preschoolers-engagement-during-science-experimentation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/119896.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">138</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> 
<h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">33</span> A Microsurgery-Specific End-Effector Equipped with a Bipolar Surgical Tool and Haptic Feedback </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hamidreza%20Hoshyarmanesh">Hamidreza Hoshyarmanesh</a>, <a href="https://publications.waset.org/abstracts/search?q=Sanju%20Lama"> Sanju Lama</a>, <a href="https://publications.waset.org/abstracts/search?q=Garnette%20R.%20Sutherland"> Garnette R. Sutherland</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In tele-operative robotic surgery, an ideal haptic device should be equipped with an intuitive and smooth end-effector to cover the surgeon’s hand/wrist degrees of freedom (DOF) and translate the hand joint motions to the end-effector of the remote manipulator with low effort and a high level of comfort. This research introduces the design and development of a microsurgery-specific end-effector, a gimbal mechanism possessing 4 passive and 1 active DOFs, equipped with a bipolar forceps and haptic feedback. The robust gimbal structure is composed of three light-weight links/joints, pitch, yaw, and roll, each consisting of a low-friction support and a 2-channel accurate optical position sensor. The third link, which provides the tool roll, was specifically designed to grip the tool prongs and accommodate a low-mass geared actuator together with a miniaturized capstan-rope mechanism. The actuator is able to generate delicate torques, using a threaded cylindrical capstan, to emulate the sense of pinch/coagulation during conventional microsurgery. While the tool's left prong is fixed to the rolling link, the right prong bears a miniaturized drum sector with a large diameter to expand the force scale and resolution. The drum transmits the actuator output torque to the right prong and generates haptic force feedback at the tool level. 
The tool is also equipped with a hall-effect sensor and magnet bar installed vis-à-vis on the inner side of the two prongs to measure the tooltip distance and provide an analogue signal to the control system. We believe that such a haptic end-effector could significantly increase the accuracy of telerobotic surgery and help avoid high forces that are known to cause bleeding/injury. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=end-effector" title="end-effector">end-effector</a>, <a href="https://publications.waset.org/abstracts/search?q=force%20generation" title=" force generation"> force generation</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic%20interface" title=" haptic interface"> haptic interface</a>, <a href="https://publications.waset.org/abstracts/search?q=robotic%20surgery" title=" robotic surgery"> robotic surgery</a>, <a href="https://publications.waset.org/abstracts/search?q=surgical%20tool" title=" surgical tool"> surgical tool</a>, <a href="https://publications.waset.org/abstracts/search?q=tele-operation" title=" tele-operation"> tele-operation</a> </p> <a href="https://publications.waset.org/abstracts/120027/a-microsurgery-specific-end-effector-equipped-with-a-bipolar-surgical-tool-and-haptic-feedback" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/120027.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">118</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">32</span> Haptic Robotic Glove for Tele-Exploration of Explosive Devices</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Gizem%20Derya%20Demir">Gizem Derya Demir</a>, 
<a href="https://publications.waset.org/abstracts/search?q=Ilayda%20Yankilic"> Ilayda Yankilic</a>, <a href="https://publications.waset.org/abstracts/search?q=Daglar%20Karamuftuoglu"> Daglar Karamuftuoglu</a>, <a href="https://publications.waset.org/abstracts/search?q=Dante%20Dorantes"> Dante Dorantes</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Nowadays, terror attacks are, unfortunately, a more common threat around the world. Therefore, safety measures have become much more essential. An alternative way to provide safety and save human lives is the use of robots for tasks such as the disassembly and disposal of bombs. In this article, remote exploration and manipulation of potential explosive devices from a safe distance are addressed by designing a novel, simple, and ergonomic haptic robotic glove. SolidWorks® Computer-Aided Design, computerized dynamic simulation, and MATLAB® kinematic and static analysis were used for the haptic robotic glove and finger design. Angle control of the servo motors was implemented with ARDUINO® IDE code on a Makeblock® MegaPi control card. Simple grasping dexterity solutions for the fingers were obtained using one linear soft sensor and one angle sensor for each finger, and six servo motors in total are used to remotely control a slave multi-tooled robotic hand. This project is still ongoing; current results and future research steps are presented. 
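The core sensor-to-servo mapping such a glove performs can be sketched in Python. The 10-bit ADC range and the calibration bounds below are illustrative assumptions, not the glove's actual values (the authors implement this logic in ARDUINO® IDE code on the MegaPi card):

```python
def flex_to_angle(raw: int, lo: int = 200, hi: int = 900) -> int:
    """Map a 10-bit ADC reading from a finger flex sensor to a
    servo angle in degrees, clamping readings outside the
    calibrated range. The lo/hi bounds are hypothetical
    calibration values, not the glove's measured sensor range."""
    raw = max(lo, min(hi, raw))
    return round((raw - lo) * 180 / (hi - lo))

def glove_to_servo_angles(readings):
    """One servo command per finger: map each of the glove's
    sensor readings to an angle for the slave robotic hand."""
    return [flex_to_angle(r) for r in readings]
```

On the microcontroller side the same computation is typically a single `map()` call per finger inside the control loop.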
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Dexterity" title="Dexterity">Dexterity</a>, <a href="https://publications.waset.org/abstracts/search?q=Exoskeleton" title=" Exoskeleton"> Exoskeleton</a>, <a href="https://publications.waset.org/abstracts/search?q=Haptics" title=" Haptics "> Haptics </a>, <a href="https://publications.waset.org/abstracts/search?q=Position%20Control" title=" Position Control"> Position Control</a>, <a href="https://publications.waset.org/abstracts/search?q=Robotic%20Hand" title=" Robotic Hand "> Robotic Hand </a>, <a href="https://publications.waset.org/abstracts/search?q=Teleoperation" title=" Teleoperation"> Teleoperation</a> </p> <a href="https://publications.waset.org/abstracts/123929/haptic-robotic-glove-for-tele-exploration-of-explosive-devices" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/123929.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">177</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">31</span> Authoring Tactile Gestures: Case Study for Emotion Stimulation </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Rodrigo%20Lentini">Rodrigo Lentini</a>, <a href="https://publications.waset.org/abstracts/search?q=Beatrice%20Ionascu"> Beatrice Ionascu</a>, <a href="https://publications.waset.org/abstracts/search?q=Friederike%20A.%20Eyssel"> Friederike A. 
Eyssel</a>, <a href="https://publications.waset.org/abstracts/search?q=Scandar%20Copti"> Scandar Copti</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohamad%20Eid"> Mohamad Eid</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The haptic modality has brought a new dimension to human computer interaction by engaging the human sense of touch. However, designing appropriate haptic stimuli, and in particular tactile stimuli, for various applications is still challenging. To tackle this issue, we present an intuitive system that facilitates the authoring of tactile gestures for various applications. The system transforms a hand gesture into a tactile gesture that can be rendered using a home-made haptic jacket. A case study is presented to demonstrate the ability of the system to develop tactile gestures that are recognizable by human subjects. Four tactile gestures are identified and tested to intensify the following four emotional responses: high valence &ndash; high arousal, high valence &ndash; low arousal, low valence &ndash; high arousal, and low valence &ndash; low arousal. A usability study with 20 participants demonstrated a high correlation between the selected tactile gestures and the intended emotional reaction. Results from this study can be used in a wide spectrum of applications ranging from gaming to interpersonal communication and multimodal simulations. 
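A minimal sketch of how an authoring system might map the four valence/arousal quadrants to actuator settings: arousal driving vibration intensity and valence the rhythm. The specific parameter values are assumptions for illustration, not the gestures selected in the study:

```python
def tactile_pattern(valence: str, arousal: str) -> dict:
    """Sketch of an emotion-quadrant to actuator-setting mapping
    for a vibrotactile jacket. The numeric intensities and rhythm
    labels are hypothetical, chosen only to show the structure of
    such a mapping."""
    intensity = 0.9 if arousal == "high" else 0.3
    rhythm = "smooth" if valence == "high" else "pulsed"
    return {"intensity": intensity, "rhythm": rhythm}
```

An authoring tool would then let a designer tune these per-gesture parameters before rendering them on the jacket's actuators.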
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=tactile%20stimulation" title="tactile stimulation">tactile stimulation</a>, <a href="https://publications.waset.org/abstracts/search?q=tactile%20gesture" title=" tactile gesture"> tactile gesture</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20reactions" title=" emotion reactions"> emotion reactions</a>, <a href="https://publications.waset.org/abstracts/search?q=arousal" title=" arousal"> arousal</a>, <a href="https://publications.waset.org/abstracts/search?q=valence" title=" valence"> valence</a> </p> <a href="https://publications.waset.org/abstracts/52327/authoring-tactile-gestures-case-study-for-emotion-stimulation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/52327.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">371</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">30</span> Accuracy/Precision Evaluation of Excalibur I: A Neurosurgery-Specific Haptic Hand Controller</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hamidreza%20Hoshyarmanesh">Hamidreza Hoshyarmanesh</a>, <a href="https://publications.waset.org/abstracts/search?q=Benjamin%20Durante"> Benjamin Durante</a>, <a href="https://publications.waset.org/abstracts/search?q=Alex%20Irwin"> Alex Irwin</a>, <a href="https://publications.waset.org/abstracts/search?q=Sanju%20Lama"> Sanju Lama</a>, <a href="https://publications.waset.org/abstracts/search?q=Kourosh%20Zareinia"> Kourosh Zareinia</a>, <a href="https://publications.waset.org/abstracts/search?q=Garnette%20R.%20Sutherland"> Garnette R. 
Sutherland</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This study reports on a proposed method to evaluate the accuracy and precision of Excalibur I, a neurosurgery-specific haptic hand controller, designed and developed at Project neuroArm. The efficiency and success of robot-assisted telesurgery are considerably contingent on how accurately and precisely the haptic hand controller (master/local robot) interprets the kinematic indices of motion, i.e., position and orientation, from the surgeon’s upper limb to the slave/remote robot. A proposed test rig is designed and manufactured according to standard ASTM F2554-10 to determine the accuracy and precision range of Excalibur I at four different locations within its workspace: central workspace, extreme forward, far left and far right. The test rig is metrologically characterized by a coordinate measuring machine (accuracy and repeatability < ± 5 µm). Only the serial linkage of the haptic device is examined due to the use of the Structural Length Index (SLI). The results indicate that accuracy decreases by moving from the workspace central area towards the borders of the workspace. In a comparative study, Excalibur I performs on par with the PHANToM PremiumTM 3.0 and more accurately/precisely than the PHANToM PremiumTM 1.5. The error in the Cartesian coordinate system shows a dominant component in one direction (δx, δy or δz) for movements on horizontal, vertical and inclined surfaces. The average error magnitude of three attempts is recorded, considering all three error components. This research is the first promising step toward quantifying the kinematic performance of Excalibur I. 
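The accuracy/precision notions used above can be illustrated with a short sketch: accuracy as the mean Euclidean deviation of repeated tip positions from a commanded target, and precision as the spread of those deviations. This is a simplified reading of an ASTM F2554-style evaluation, not the study's exact protocol:

```python
from statistics import mean, pstdev

def accuracy_precision(target, measurements):
    """Given a commanded 3D target point and repeated measured
    tip positions, return (accuracy, precision): accuracy as the
    mean Euclidean distance from the target, precision as the
    population standard deviation of those distances. A
    simplified illustration of the evaluation idea, not the
    paper's exact metric."""
    dists = [sum((m - t) ** 2 for m, t in zip(p, target)) ** 0.5
             for p in measurements]
    return mean(dists), pstdev(dists)
```

Run at several workspace locations (central, extreme forward, far left, far right), such a metric would expose the reported fall-off in accuracy toward the workspace borders.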
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=accuracy" title="accuracy">accuracy</a>, <a href="https://publications.waset.org/abstracts/search?q=advanced%20metrology" title=" advanced metrology"> advanced metrology</a>, <a href="https://publications.waset.org/abstracts/search?q=hand%20controller" title=" hand controller"> hand controller</a>, <a href="https://publications.waset.org/abstracts/search?q=precision" title=" precision"> precision</a>, <a href="https://publications.waset.org/abstracts/search?q=robot-assisted%20surgery" title=" robot-assisted surgery"> robot-assisted surgery</a>, <a href="https://publications.waset.org/abstracts/search?q=tele-operation" title=" tele-operation"> tele-operation</a>, <a href="https://publications.waset.org/abstracts/search?q=workspace" title=" workspace"> workspace</a> </p> <a href="https://publications.waset.org/abstracts/86416/accuracyprecision-evaluation-of-excalibur-i-a-neurosurgery-specific-haptic-hand-controller" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/86416.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">336</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">29</span> 6 DOF Cable-Driven Haptic Robot for Rendering High Axial Force with Low Off-Axis Impedance</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Naghmeh%20Zamani">Naghmeh Zamani</a>, <a href="https://publications.waset.org/abstracts/search?q=Ashkan%20Pourkand"> Ashkan Pourkand</a>, <a href="https://publications.waset.org/abstracts/search?q=David%20Grow"> David Grow</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents the 
design and mechanical model of a hybrid impedance/admittance haptic device optimized for applications, such as bone drilling, spinal awl probe use, and other surgical techniques where high force is required in the tool-axial direction and low impedance is needed in all other directions. The required performance levels cannot be met by existing off-the-shelf haptic devices. This design may allow critical improvements in simulator fidelity for surgical training. The device consists primarily of two low-mass (carbon-fiber) plates with a rod passing through them; collectively, the device provides 6 DOF. The rod slides through a bushing in the top plate and is connected to the bottom plate by a universal joint constrained to move in only 2 DOF, allowing axial torque to be displayed to the user’s hand. The two parallel plates are actuated and located by means of four cables pulled by motors. The forward kinematic equations are derived under the constraint that the plates’ orientation remains constant, and the corresponding equations are solved using the Newton-Raphson method. The static force/torque equations are also presented. Finally, we present the predicted distributions of location error, cable velocity, cable tension, force, and torque for the device. These results and preliminary hardware fabrication indicate that this design may provide a revolutionary approach to haptic display of many surgical procedures by means of an architecture whose workspace height and width can be scaled arbitrarily.
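As a rough illustration of the forward-kinematics step described above (recovering the plate pose from measured cable lengths by Newton-Raphson, with the plate orientation held constant), the sketch below solves the cable-length constraint equations for the plate centre. The anchor geometry and plate dimensions are invented placeholders, not the paper's actual design.

```python
import numpy as np

# Hypothetical geometry: four winch anchor points, and the matching cable
# attachment points expressed in the plate frame. With the orientation held
# constant, the only unknown is the plate centre position p.
ANCHORS = np.array([[0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1]], float)
PLATE_PTS = 0.1 * np.array([[-1, -1, 0], [1, -1, 0], [1, 1, 0], [-1, 1, 0]], float)

def cable_lengths(p):
    """Inverse kinematics: cable length for each winch, given centre p."""
    return np.linalg.norm(ANCHORS - (p + PLATE_PTS), axis=1)

def forward_kinematics(lengths, p0, tol=1e-10, max_iter=50):
    """Newton-Raphson (Gauss-Newton for the overdetermined 4x3 system):
    find the plate centre whose predicted cable lengths match the
    measured ones."""
    p = p0.astype(float)
    for _ in range(max_iter):
        r = cable_lengths(p) - lengths            # residuals, shape (4,)
        if np.linalg.norm(r) < tol:
            break
        # Analytic Jacobian: d|a - (p+b)| / dp = -(a - (p+b)) / |a - (p+b)|
        diff = ANCHORS - (p + PLATE_PTS)
        J = -diff / np.linalg.norm(diff, axis=1, keepdims=True)
        p -= np.linalg.lstsq(J, r, rcond=None)[0]  # least-squares Newton step
    return p

# Round-trip check: lengths computed at a known pose are inverted back.
p_true = np.array([0.4, 0.6, 0.2])
p_est = forward_kinematics(cable_lengths(p_true), p0=np.array([0.5, 0.5, 0.5]))
```

In practice the residual equations would also encode the orientation constraint; here that constraint is baked in by treating the attachment offsets as fixed.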
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=cable%20direct%20driven%20robot" title="cable direct driven robot">cable direct driven robot</a>, <a href="https://publications.waset.org/abstracts/search?q=haptics" title=" haptics"> haptics</a>, <a href="https://publications.waset.org/abstracts/search?q=parallel%20plates" title=" parallel plates"> parallel plates</a>, <a href="https://publications.waset.org/abstracts/search?q=bone%20drilling" title=" bone drilling"> bone drilling</a> </p> <a href="https://publications.waset.org/abstracts/77321/6-dof-cable-driven-haptic-robot-for-rendering-high-axial-force-with-low-off-axis-impedance" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/77321.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">258</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">28</span> A Study on User Authentication Method Using Haptic Actuator and Security Evaluation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yo%20Han%20Choi">Yo Han Choi</a>, <a href="https://publications.waset.org/abstracts/search?q=Hee%20Suk%20Seo"> Hee Suk Seo</a>, <a href="https://publications.waset.org/abstracts/search?q=Seung%20Hwan%20Ju"> Seung Hwan Ju</a>, <a href="https://publications.waset.org/abstracts/search?q=Sung%20Hyu%20Han"> Sung Hyu Han</a> </p> <p class="card-text"><strong>Abstract:</strong></p> As various portable devices have been launched, it has become common to conduct smart business with them. Since smart business can access company-internal resources from external, remote places, user authentication that can identify legitimate users is an important factor.
The most common user authentication method relies on a user ID and password. With ID/password authentication, users must view and enter their authentication information themselves. Because this scheme depends on the user’s vision, there is a threat of password leakage through snooping while the user enters his or her authentication information. This study designed and produced a user authentication module that uses an actuator to counter this snooping threat. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=actuator" title="actuator">actuator</a>, <a href="https://publications.waset.org/abstracts/search?q=user%20authentication" title=" user authentication"> user authentication</a>, <a href="https://publications.waset.org/abstracts/search?q=security%20evaluation" title=" security evaluation"> security evaluation</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic%20actuator" title=" haptic actuator"> haptic actuator</a> </p> <a href="https://publications.waset.org/abstracts/15894/a-study-on-user-authentication-method-using-haptic-actuator-and-security-evaluation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/15894.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">346</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">27</span> MAGNI Dynamics: A Vision-Based Kinematic and Dynamic Upper-Limb Model for Intelligent Robotic Rehabilitation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Alexandros%20Lioulemes">Alexandros Lioulemes</a>, <a
href="https://publications.waset.org/abstracts/search?q=Michail%20Theofanidis"> Michail Theofanidis</a>, <a href="https://publications.waset.org/abstracts/search?q=Varun%20Kanal"> Varun Kanal</a>, <a href="https://publications.waset.org/abstracts/search?q=Konstantinos%20Tsiakas"> Konstantinos Tsiakas</a>, <a href="https://publications.waset.org/abstracts/search?q=Maher%20Abujelala"> Maher Abujelala</a>, <a href="https://publications.waset.org/abstracts/search?q=Chris%20Collander"> Chris Collander</a>, <a href="https://publications.waset.org/abstracts/search?q=William%20B.%20Townsend"> William B. Townsend</a>, <a href="https://publications.waset.org/abstracts/search?q=Angie%20Boisselle"> Angie Boisselle</a>, <a href="https://publications.waset.org/abstracts/search?q=Fillia%20Makedon"> Fillia Makedon</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents a home-based robot-rehabilitation instrument, called &ldquo;MAGNI Dynamics&rdquo;, that utilizes a vision-based kinematic/dynamic module and an adaptive haptic feedback controller. The system is expected to provide personalized rehabilitation by adjusting its resistive and supportive behavior according to a fuzzy intelligence controller that acts as an inference system, which correlates the user&rsquo;s performance to different stiffness factors. The vision module uses the Kinect&rsquo;s skeletal tracking to monitor the user&rsquo;s effort in an unobtrusive and safe way, by estimating the torque that affects the user&rsquo;s arm. The system&rsquo;s torque estimates are validated by capturing electromyographic data from primitive arm motions (shoulder abduction and shoulder forward flexion). Moreover, we present and analyze how the Barrett WAM generates a force-field with a haptic controller to support or challenge the users.
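As a rough illustration of how a static, vision-based torque estimate of this kind can work, the sketch below computes the gravity torque at the shoulder from two tracked joint positions. The arm mass and centre-of-mass distance are invented placeholders, and the paper's actual kinematic/dynamic model is certainly more elaborate than this single-segment gravity model.

```python
import numpy as np

# Hypothetical anthropometric constants (not from the paper): upper-limb
# mass and distance from the shoulder to the limb's centre of mass.
ARM_MASS_KG = 3.5
COM_DIST_M = 0.30
G = 9.81

def shoulder_abduction_torque(shoulder, elbow):
    """Static gravity torque at the shoulder, estimated from two skeletal
    joint positions (metres, y-up camera frame, as in Kinect skeletal
    tracking). theta is the arm's angle from vertical: arm hanging down
    gives zero torque, arm horizontal gives the maximum."""
    v = np.asarray(elbow, float) - np.asarray(shoulder, float)
    down = np.array([0.0, -1.0, 0.0])               # gravity direction
    cos_t = np.dot(v, down) / np.linalg.norm(v)
    theta = np.arccos(np.clip(cos_t, -1.0, 1.0))
    return ARM_MASS_KG * G * COM_DIST_M * np.sin(theta)

# Arm held horizontal (theta = 90 degrees): maximum gravity torque.
tau = shoulder_abduction_torque([0, 0, 0], [0.3, 0.0, 0.0])
```

A real system would add the forearm/hand segments and dynamic terms; EMG data, as in the abstract, could then be used to validate these estimates.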
Experiments show that shifting the proportional value, which corresponds to different stiffness factors of the haptic path, can potentially help the user improve his or her motor skills. Finally, potential areas for future research are discussed, addressing how a rehabilitation robotics framework may incorporate multisensing data to improve the user&rsquo;s recovery process. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human-robot%20interaction" title="human-robot interaction">human-robot interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=kinect" title=" kinect"> kinect</a>, <a href="https://publications.waset.org/abstracts/search?q=kinematics" title=" kinematics"> kinematics</a>, <a href="https://publications.waset.org/abstracts/search?q=dynamics" title=" dynamics"> dynamics</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic%20control" title=" haptic control"> haptic control</a>, <a href="https://publications.waset.org/abstracts/search?q=rehabilitation%20robotics" title=" rehabilitation robotics"> rehabilitation robotics</a>, <a href="https://publications.waset.org/abstracts/search?q=artificial%20intelligence" title=" artificial intelligence"> artificial intelligence</a> </p> <a href="https://publications.waset.org/abstracts/58367/magni-dynamics-a-vision-based-kinematic-and-dynamic-upper-limb-model-for-intelligent-robotic-rehabilitation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/58367.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">329</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">26</span> Vibro-Tactile Equalizer for Musical Energy-Valence Categorization</h5> <div class="card-body"> <p
class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Dhanya%20Nair">Dhanya Nair</a>, <a href="https://publications.waset.org/abstracts/search?q=Nicholas%20Mirchandani"> Nicholas Mirchandani</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Musical haptic systems can enhance a listener’s musical experience while providing an alternative platform for the hearing impaired to experience music. Current tactile music technologies focus on tactile metronomes to synchronize performers or on encoding musical notes into distinguishable (albeit distracting) tactile patterns. There is growing interest in the development of musical haptic systems to augment the auditory experience, although the haptic-music relationship is still not well understood. This paper presents a tactile music interface that provides vibrations to multiple fingertips in synchronicity with auditory music. Like an audio equalizer, different frequency bands are filtered out, and the power in each frequency band is computed and converted to a corresponding vibrational strength. These vibrations are felt on different fingertips, each corresponding to a different frequency band. Songs from different parts of the energy-valence spectrum, as classified by their energy and valence, were used to test the effectiveness of the system and to understand the relationship between music and tactile sensations. Three participants were trained on one song categorized as sad (low energy and low valence score) and one song categorized as happy (high energy and high valence score). They were trained both with and without auditory feedback (listening to the song while experiencing the tactile music on their fingertips and then experiencing the vibrations alone without the music). The participants were then tested on three songs from both categories, without any auditory feedback, and were asked to classify the tactile vibrations they felt into either category.
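The equalizer-style processing described above (filter the audio into frequency bands, compute the power in each band, and map power to a per-fingertip vibration strength) can be sketched as follows. The sampling rate and band edges are illustrative choices, not the authors' parameters.

```python
import numpy as np

def band_vibration_levels(samples, fs, bands, max_amp=1.0):
    """Split a mono audio frame into frequency bands (like an audio
    equalizer), compute the spectral power in each band from the FFT,
    and map it to a vibration amplitude per fingertip in [0, max_amp]."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    powers = np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum()
                       for lo, hi in bands])
    peak = powers.max()
    # Normalise so the strongest band drives its motor at max_amp.
    return max_amp * powers / peak if peak > 0 else powers

# A pure 440 Hz tone should drive mainly the band containing 440 Hz.
fs = 8000
t = np.arange(1024) / fs
tone = np.sin(2 * np.pi * 440 * t)
bands = [(0, 300), (300, 600), (600, 1200), (1200, 2400)]  # one per fingertip
levels = band_vibration_levels(tone, fs, bands)
```

In a real-time system the same computation would run per audio frame, with each band's level sent to the corresponding fingertip's vibration motor.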
The participants were blinded to the songs being tested and were not provided any feedback on the accuracy of their classification. These participants were able to classify the music with 100% accuracy. Although the songs tested were on two opposite spectrums (sad/happy), the preliminary results show the potential of utilizing a vibrotactile equalizer, like the one presented, for augmenting musical experience while furthering the current understanding of music tactile relationship. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=haptic%20music%20relationship" title="haptic music relationship">haptic music relationship</a>, <a href="https://publications.waset.org/abstracts/search?q=tactile%20equalizer" title=" tactile equalizer"> tactile equalizer</a>, <a href="https://publications.waset.org/abstracts/search?q=tactile%20music" title=" tactile music"> tactile music</a>, <a href="https://publications.waset.org/abstracts/search?q=vibrations%20and%20mood" title=" vibrations and mood"> vibrations and mood</a> </p> <a href="https://publications.waset.org/abstracts/136784/vibro-tactile-equalizer-for-musical-energy-valence-categorization" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/136784.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">181</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">25</span> Integrating Wearable Devices in Real-Time Computer Applications of Petrochemical Systems</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Paul%20B%20Stone">Paul B Stone</a>, <a href="https://publications.waset.org/abstracts/search?q=Subhashini%20Ganapathy"> Subhashini 
Ganapathy</a>, <a href="https://publications.waset.org/abstracts/search?q=Mary%20E.%20Fendley"> Mary E. Fendley</a>, <a href="https://publications.waset.org/abstracts/search?q=Layla%20Akilan"> Layla Akilan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> As notifications become more common through mobile devices, it is important to understand the impact of wearable devices on the user experience of man-machine interfaces. This study examined the use of a wearable device for a real-time system using a computer-simulated petrochemical system. The key research question was to determine how the information provided by the wearable device can improve human performance, as measured by situational awareness and decision making. Results indicate that there was a reduction in response time when using the watch and no difference in situational awareness. Perception of using the watch was positive, with 83% of users finding value in using the watch and receiving haptic feedback.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=computer%20applications" title="computer applications">computer applications</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic%20feedback" title=" haptic feedback"> haptic feedback</a>, <a href="https://publications.waset.org/abstracts/search?q=petrochemical%20systems" title=" petrochemical systems"> petrochemical systems</a>, <a href="https://publications.waset.org/abstracts/search?q=situational%20awareness" title=" situational awareness"> situational awareness</a>, <a href="https://publications.waset.org/abstracts/search?q=wearable%20technology" title=" wearable technology"> wearable technology</a> </p> <a href="https://publications.waset.org/abstracts/139604/integrating-wearable-devices-in-real-time-computer-applications-of-petrochemical-systems" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/139604.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">200</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">24</span> A Parallel Computation Based on GPU Programming for a 3D Compressible Fluid Flow Simulation </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sugeng%20Rianto">Sugeng Rianto</a>, <a href="https://publications.waset.org/abstracts/search?q=P.W.%20Arinto%20Yudi"> P.W. Arinto Yudi</a>, <a href="https://publications.waset.org/abstracts/search?q=Soemarno%20%20Muhammad%20Nurhuda"> Soemarno Muhammad Nurhuda</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A computation of a 3D compressible fluid flow for virtual environment with haptic interaction can be a non-trivial issue. 
The main challenge is to achieve good performance while balancing visualization, tactile feedback interaction, and computation. In this paper, we describe our computational approach based on parallel programming on a GPU. The 3D fluid flow solvers were developed for smoke dispersion simulation by combining cubic interpolated propagation (CIP) based solvers with the parallelism and programmability of the GPU. The fluid flow solver is implemented in a GPU-CPU message-passing scheme to allow rapid development of haptic feedback modes for fluid dynamic data. With the CIP scheme, multiphase fluid flow equations can be solved simultaneously. To accelerate the computation further, the Navier-Stokes equations (NSEs) are packed into texel channels, and computations are performed on pixels that can be regarded as a grid of cells. Therefore, despite the complexity of the obstacle geometry, multiple vertices and pixels can be processed simultaneously in parallel. The data are also shared in global memory so that the CPU can control the haptic device, providing kinaesthetic interaction and feeling. The results show that GPU-based parallel computation provides effective simulation of a compressible fluid flow model for real-time interaction in 3D computer graphics on a PC platform. This report has shown the feasibility of a new approach to solving the compressible fluid flow equations on the GPU. Experimental tests showed that compressible fluid flowing over several model obstacles, with haptic interaction, can be simulated effectively and efficiently at a reasonable frame rate with realistic visualization.
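As a concrete illustration of the CIP scheme the solvers are built on, the sketch below advects a 1D profile at constant velocity by transporting both the value and its spatial derivative along characteristics (periodic boundaries, u > 0). This is a minimal CPU sketch of the numerical scheme only, not the paper's GPU implementation.

```python
import numpy as np

def cip_advect(f, g, u, dt, dx):
    """One step of 1D cubic interpolated propagation (CIP) advection for
    constant velocity u > 0 on a periodic grid. CIP builds a cubic from
    the value f and derivative g at each cell and its upwind neighbour,
    then evaluates it at the semi-Lagrangian departure point."""
    fu, gu = np.roll(f, 1), np.roll(g, 1)   # upwind neighbour (u > 0)
    D = -dx                                  # offset to the upwind node
    xi = -u * dt                             # departure-point offset
    a = (g + gu) / D**2 + 2.0 * (f - fu) / D**3
    b = 3.0 * (fu - f) / D**2 - (2.0 * g + gu) / D
    f_new = ((a * xi + b) * xi + g) * xi + f
    g_new = (3.0 * a * xi + 2.0 * b) * xi + g
    return f_new, g_new

# Advect a Gaussian pulse; CIP should keep its shape with little smearing.
n, dx, u, dt = 100, 1.0, 1.0, 0.5
x = np.arange(n) * dx
f = np.exp(-((x - 25.0) / 4.0) ** 2)
g = np.gradient(f, dx)
for _ in range(40):                          # total shift: 40 * 0.5 = 20 cells
    f, g = cip_advect(f, g, u, dt, dx)
```

The GPU version in the paper evaluates the same kind of update per texel, with the field components packed into texture channels so many cells advance in parallel.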
These results confirm that good performance and a successful balance between visualization, tactile feedback interaction, and computation can be achieved. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=CIP" title="CIP">CIP</a>, <a href="https://publications.waset.org/abstracts/search?q=compressible%20fluid" title=" compressible fluid"> compressible fluid</a>, <a href="https://publications.waset.org/abstracts/search?q=GPU%20programming" title=" GPU programming"> GPU programming</a>, <a href="https://publications.waset.org/abstracts/search?q=parallel%20computation" title=" parallel computation"> parallel computation</a>, <a href="https://publications.waset.org/abstracts/search?q=real-time%20visualisation" title=" real-time visualisation"> real-time visualisation</a> </p> <a href="https://publications.waset.org/abstracts/3308/a-parallel-computation-based-on-gpu-programming-for-a-3d-compressible-fluid-flow-simulation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/3308.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">432</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">23</span> A Step Magnitude Haptic Feedback Device and Platform for Better Way to Review Kinesthetic Vibrotactile 3D Design in Professional Training</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Biki%20Sarmah">Biki Sarmah</a>, <a href="https://publications.waset.org/abstracts/search?q=Priyanko%20Raj%20Mudiar"> Priyanko Raj Mudiar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the modern world of remotely interactive virtual reality-based learning and teaching, including
professional skill-building and acquisition practices as well as data acquisition and robotic systems, the application of field-programmable neurostimulator aids and first-hand interactive sensitisation techniques in 3D holographic audio-visual platforms has been a long-held goal of many scholars, professionals, scientists, and students. Integrating 'kinaesthetic vibrotactile haptic perception' with actuated step-magnitude contact profiloscopy in augmented reality-based learning platforms and professional training can be achieved through carefully coordinated image telemetry together with remote data-mining and control techniques. A real-time, computer-aided (PLC-SCADA) field-calibration algorithm must be designed for this purpose. Most importantly, in order to realise and 'interact' with 3D holographic models displayed on a remote screen using remote laser image telemetry and control, all spatio-physical parameters, such as cardinal alignment, gyroscopic compensation, surface profile, and thermal composition, must be handled using zero-order type 1 actuators (or transducers), because they provide zero hysteresis, zero backlash, and low dead time, together with linear, fully controllable, intrinsically observable, and smooth performance requiring minimal error compensation, while ensuring the best possible ergonomic comfort for users.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=haptic%20feedback" title="haptic feedback">haptic feedback</a>, <a href="https://publications.waset.org/abstracts/search?q=kinaesthetic%20vibrotactile%203D%20%20design" title=" kinaesthetic vibrotactile 3D design"> kinaesthetic vibrotactile 3D design</a>, <a href="https://publications.waset.org/abstracts/search?q=medical%20simulation%20training" title=" medical simulation training"> medical simulation training</a>, <a href="https://publications.waset.org/abstracts/search?q=piezo%20diaphragm%20based%20actuator" title=" piezo diaphragm based actuator"> piezo diaphragm based actuator</a> </p> <a href="https://publications.waset.org/abstracts/131443/a-step-magnitude-haptic-feedback-device-and-platform-for-better-way-to-review-kinesthetic-vibrotactile-3d-design-in-professional-training" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/131443.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">166</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">22</span> 3D Text Toys: Creative Approach to Experiential and Immersive Learning for World Literacy</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Azyz%20Sharafy">Azyz Sharafy</a> </p> <p class="card-text"><strong>Abstract:</strong></p> 3D Text Toys is an innovative and creative approach that utilizes 3D text objects to enhance creativity, literacy, and basic learning in an enjoyable and gamified manner. By using 3D Text Toys, children can develop their creativity, visually learn words and texts, and apply their artistic talents within their creative abilities. 
This process incorporates haptic engagement with 2D and 3D texts, word building, and mechanical construction of everyday objects, thereby facilitating better word and text retention. The concept involves constructing visual objects made entirely out of 3D text/words, where each component of the object represents a word or text element. For instance, a bird can be recreated using words or text shaped like its wings, beak, legs, head, and body, resulting in a 3D representation of the bird purely composed of text. This can serve as an art piece or a learning tool in the form of a 3D text toy. These 3D text objects or toys can be crafted using natural materials such as leaves, twigs, strings, or ropes, or they can be made from various physical materials using traditional crafting tools. Digital versions of these objects can be created using 2D or 3D software on devices like phones, laptops, iPads, or computers. To transform digital designs into physical objects, computerized machines such as CNC routers, laser cutters, and 3D printers can be utilized. Once the parts are printed or cut out, students can assemble the 3D texts by gluing them together, resulting in natural or everyday 3D text objects. These objects can be painted to create artistic pieces or text toys, and the addition of wheels can transform them into moving toys. One of the significant advantages of this visual and creative object-based learning process is that students not only learn words but also derive enjoyment from the process of creating, painting, and playing with these objects. The ownership and creation process further enhances comprehension and word retention. Moreover, for individuals with learning disabilities such as dyslexia, ADD (Attention Deficit Disorder), or other learning difficulties, the visual and haptic approach of 3D Text Toys can serve as an additional creative and personalized learning aid. 
The application of 3D Text Toys extends to both the English language and any other global written language. The adaptation and creative application may vary depending on the country, space, and native written language. Furthermore, the implementation of this visual and haptic learning tool can be tailored to teach foreign languages based on age level and comprehension requirements. In summary, this creative, haptic, and visual approach has the potential to serve as a global literacy tool. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=3D%20text%20toys" title="3D text toys">3D text toys</a>, <a href="https://publications.waset.org/abstracts/search?q=creative" title=" creative"> creative</a>, <a href="https://publications.waset.org/abstracts/search?q=artistic" title=" artistic"> artistic</a>, <a href="https://publications.waset.org/abstracts/search?q=visual%20learning%20for%20world%20literacy" title=" visual learning for world literacy"> visual learning for world literacy</a> </p> <a href="https://publications.waset.org/abstracts/166053/3d-text-toys-creative-approach-to-experiential-and-immersive-learning-for-world-literacy" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/166053.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">64</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">21</span> Design and Fabrication of a Programmable Stiffness-Sensitive Gripper for Object Handling</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mehdi%20Modabberifar">Mehdi Modabberifar</a>, <a href="https://publications.waset.org/abstracts/search?q=Sanaz%20Jabary"> Sanaz Jabary</a>, <a 
href="https://publications.waset.org/abstracts/search?q=Mojtaba%20Ghodsi"> Mojtaba Ghodsi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Stiffness sensing is an important issue in medical diagnosis, robotic surgery, and the safe handling and grasping of objects on production lines. Surgery requires detecting and characterizing lumps embedded in soft tissue and safely removing and handling the detected lumps. In industry, likewise, it is very important to grasp and handle an object without damaging it in places that a human operator cannot access. In this paper, a method for object handling is presented. It is based on an intelligent gripper that detects the object's stiffness and then sets a programmable force for grasping the object to move it. The main components of this system include sensors (for measuring force and displacement), electronics (electrical and electronic circuits, tactile data processing, and the force control system), mechanics (the gripper mechanism and its driving system), and the display unit. The system uses a rotary potentiometer to measure gripper displacement. A microcontroller, using the feedback received from the load cell mounted on the finger of the gripper, calculates the stiffness and then commands the gripper motor to apply a certain force to the object. Experiments on samples with different stiffness show that the gripper works successfully. The gripper can be used in haptic interfaces or in robotic systems for object handling.
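A minimal sketch of the control loop described above (stiffness from the change in load-cell force over the change in potentiometer-measured displacement, then a programmed grasp force) could look like this. All thresholds and force limits are illustrative placeholders, not values from the paper.

```python
def estimate_stiffness(delta_force_n, delta_disp_mm):
    """Stiffness of the grasped object, k = dF/dx: change in load-cell
    force (N) over change in jaw displacement (mm, from the rotary
    potentiometer) while the gripper closes on the object."""
    if delta_disp_mm <= 0:
        raise ValueError("jaw must have closed by a positive amount")
    return delta_force_n / delta_disp_mm

def grip_force_for(stiffness_n_per_mm, soft_limit=0.5, hard_limit=5.0,
                   min_force_n=1.0, max_force_n=10.0):
    """Map measured stiffness to a programmable grasp force: a gentle
    grip for soft objects, a firmer one for stiff objects, interpolated
    linearly in between (thresholds are illustrative)."""
    if stiffness_n_per_mm <= soft_limit:
        return min_force_n
    if stiffness_n_per_mm >= hard_limit:
        return max_force_n
    frac = (stiffness_n_per_mm - soft_limit) / (hard_limit - soft_limit)
    return min_force_n + frac * (max_force_n - min_force_n)

# Example cycle: the jaw closed 4 mm while force rose by 2 N.
k = estimate_stiffness(delta_force_n=2.0, delta_disp_mm=4.0)   # 0.5 N/mm
force_command = grip_force_for(k)
```

On the actual microcontroller the same two steps would run on raw ADC readings, with the result driving the gripper motor's force controller.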
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=gripper" title="gripper">gripper</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic" title=" haptic"> haptic</a>, <a href="https://publications.waset.org/abstracts/search?q=stiffness" title=" stiffness"> stiffness</a>, <a href="https://publications.waset.org/abstracts/search?q=robotic" title=" robotic"> robotic</a> </p> <a href="https://publications.waset.org/abstracts/50696/design-and-fabrication-of-a-programmable-stiffness-sensitive-gripper-for-object-handling" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/50696.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">358</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">20</span> Development of a Real-Time Simulink Based Robotic System to Study Force Feedback Mechanism during Instrument-Object Interaction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jaydip%20M.%20Desai">Jaydip M. Desai</a>, <a href="https://publications.waset.org/abstracts/search?q=Antonio%20Valdevit"> Antonio Valdevit</a>, <a href="https://publications.waset.org/abstracts/search?q=Arthur%20Ritter"> Arthur Ritter</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Robotic surgery is used to enhance minimally invasive surgical procedures. It provides a greater degree of freedom for surgical tools but lacks a haptic feedback system to provide a sense of touch to the surgeon. Surgical robots work on a master-slave principle, where the user is the master and the robotic arms are the slaves.
Currently, surgical robots provide precise control of the surgical tools but rely heavily on visual feedback, which can sometimes lead to damage to internal organs. The goal of this research was to design and develop a real-time Simulink-based robotic system to study the force feedback mechanism during instrument-object interaction. The setup includes three Velmex XSlide assemblies (an XYZ stage) for three-dimensional movement, an end-effector assembly for the forceps, an electronic circuit for four strain gages, two Novint Falcon 3D gaming controllers, a microcontroller board with linear actuators, and the MATLAB and Simulink toolboxes. The strain gages were calibrated using an Imada digital force gauge and tested with a hard-core wire to measure instrument-object interaction in the range of 0-35 N. The designed Simulink model successfully acquires 3D coordinates from the two Novint Falcon controllers and transfers the coordinates to the XYZ stage and forceps. The Simulink model also reads the strain-gage signals in real time through the microcontroller assembly's 10-bit analog-to-digital converter, converts the voltages into forces, and feeds the output signals back to the Novint Falcon controllers for the force feedback mechanism. The experimental setup allows the user to change the forward kinematics algorithms to achieve the best desired movement of the XYZ stage and forceps. This project combines haptic technology with a surgical robot to provide a sense of touch to the user controlling the forceps through a machine-computer interface.
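The voltage-to-force conversion step described above can be sketched as follows. Only the 10-bit ADC resolution and the 0-35 N range come from the text; the reference voltage and the calibration endpoints are invented placeholders for a typical amplified strain-gage channel.

```python
def adc_to_force(adc_counts, v_ref=5.0, adc_bits=10,
                 volts_at_zero=0.5, volts_at_full=4.5, full_scale_n=35.0):
    """Convert a 10-bit ADC reading of the strain-gage amplifier output
    into force (N), assuming a linear calibration over the 0-35 N range.
    The voltage endpoints are illustrative calibration constants."""
    volts = adc_counts * v_ref / (2 ** adc_bits - 1)      # counts -> volts
    frac = (volts - volts_at_zero) / (volts_at_full - volts_at_zero)
    return max(0.0, min(full_scale_n, frac * full_scale_n))  # clamp to range

# Mid-scale ADC reading maps to roughly half of the 35 N full scale.
f_mid = adc_to_force(512)
```

In the described system this conversion would run inside the real-time Simulink loop, with the resulting force fed back to the Novint Falcon controllers.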
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=surgical%20robot" title="surgical robot">surgical robot</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic%20feedback" title=" haptic feedback"> haptic feedback</a>, <a href="https://publications.waset.org/abstracts/search?q=MATLAB" title=" MATLAB"> MATLAB</a>, <a href="https://publications.waset.org/abstracts/search?q=strain%20gage" title=" strain gage"> strain gage</a>, <a href="https://publications.waset.org/abstracts/search?q=simulink" title=" simulink"> simulink</a> </p> <a href="https://publications.waset.org/abstracts/27432/development-of-a-real-time-simulink-based-robotic-system-to-study-force-feedback-mechanism-during-instrument-object-interaction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/27432.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">534</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">19</span> Anajaa-Visual Substitution System: A Navigation Assistive Device for the Visually Impaired</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Juan%20Pablo%20Botero%20Torres">Juan Pablo Botero Torres</a>, <a href="https://publications.waset.org/abstracts/search?q=Alba%20Avila"> Alba Avila</a>, <a href="https://publications.waset.org/abstracts/search?q=Luis%20Felipe%20Giraldo"> Luis Felipe Giraldo</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Independent navigation and mobility through unknown spaces pose a challenge for the autonomy of visually impaired people (VIP), who have relied on the use of traditional assistive tools like the white cane and trained dogs. 
However, emerging visually assistive technologies (VAT) have proposed several human-machine interfaces (HMIs) that could improve VIP’s ability for self-guidance. Here, we introduce the design and implementation of a visually assistive device, Anajaa – Visual Substitution System (AVSS). This system integrates ultrasonic sensors with custom electronics and computer vision models (convolutional neural networks) in order to achieve a robust system that acquires information about the surrounding space and transmits it to the user in an intuitive and efficient manner. AVSS consists of two modules: the sensing module and the actuation module, which are fitted to a chest mount and belt that communicate via Bluetooth. The sensing module was designed for the acquisition and processing of proximity signals provided by an array of ultrasonic sensors. The distribution of these within the chest mount allows an accurate representation of the surrounding space, discretized into three different levels of proximity ranging from 0 to 6 meters. Additionally, this module is fitted with an RGB-D camera used to detect potentially threatening obstacles, like staircases, using a convolutional neural network specifically trained for this purpose. Subsequently, the depth data is used to estimate the distance between the stairs and the user. The information gathered from this module is then sent to the actuation module, which creates an HMI by means of a 3x2 array of vibration motors that make up the tactile display and allow the system to deliver haptic feedback. The actuation module uses vibrational messages (tactones), which change in both amplitude and frequency to deliver different awareness levels according to the proximity of the obstacle. This enables the system to deliver an intuitive interface. Both modules were tested under lab conditions, and the HMI was additionally tested with a focal group of VIP.
The lab testing was conducted in order to establish the processing speed of the computer vision algorithms. This experimentation determined that the model can process 0.59 frames per second (FPS); this is considered an adequate processing speed, taking into account that the walking speed of VIP is 1.439 m/s. In order to test the HMI, we conducted a focal group composed of two females and two males between the ages of 35 and 65 years. The subject selection was aided by the Colombian Cooperative of Work and Services for the Sightless (COOTRASIN). We analyzed the learning process of the haptic messages throughout five experimentation sessions using two metrics: message discrimination and localization success. These correspond to the ability of the subjects to recognize different tactones and locate them within the tactile display. Both were calculated as the mean across all subjects. Results show that the focal group achieved a message discrimination of 70% and a localization success of 80%, demonstrating how the proposed HMI leads to the appropriation and understanding of the feedback messages, enabling the user’s awareness of their surrounding space.
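The tactone scheme described above — three discretized proximity levels over a 0–6 m range, each mapped to a vibrational message of a given amplitude and frequency — can be sketched roughly as below. This is not the authors' code: the specific distance bands, amplitudes, and frequencies are assumptions chosen only to illustrate the mapping.

```python
# Illustrative sketch of mapping obstacle distance to a tactone.
# The band boundaries and (amplitude, frequency) pairs are hypothetical.

PROXIMITY_LEVELS = {      # distance band (m) -> proximity level index
    (0.0, 2.0): 2,        # near: strongest warning
    (2.0, 4.0): 1,        # middle band
    (4.0, 6.0): 0,        # far: gentlest cue
}

TACTONES = [              # level index -> (duty-cycle amplitude, frequency Hz)
    (0.3, 100),
    (0.6, 175),
    (1.0, 250),
]

def tactone_for_distance(d: float):
    """Return the (amplitude, frequency) tactone for an obstacle at
    distance d metres, or None if it is beyond the 6 m sensing range."""
    for (lo, hi), level in PROXIMITY_LEVELS.items():
        if lo <= d < hi:
            return TACTONES[level]
    return None
```

In the device as described, the selected tactone would then be rendered on the appropriate motor of the 3x2 tactile display according to the obstacle's direction.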
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=computer%20vision%20on%20embedded%20systems" title="computer vision on embedded systems">computer vision on embedded systems</a>, <a href="https://publications.waset.org/abstracts/search?q=electronic%20trave%20aids" title=" electronic trave aids"> electronic trave aids</a>, <a href="https://publications.waset.org/abstracts/search?q=human-machine%20interface" title=" human-machine interface"> human-machine interface</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic%20feedback" title=" haptic feedback"> haptic feedback</a>, <a href="https://publications.waset.org/abstracts/search?q=visual%20assistive%20technologies" title=" visual assistive technologies"> visual assistive technologies</a>, <a href="https://publications.waset.org/abstracts/search?q=vision%20substitution%20systems" title=" vision substitution systems"> vision substitution systems</a> </p> <a href="https://publications.waset.org/abstracts/138119/anajaa-visual-substitution-system-a-navigation-assistive-device-for-the-visually-impaired" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/138119.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">81</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">18</span> Students&#039; Perception of Using Dental E-Models in an Inquiry-Based Curriculum</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yanqi%20Yang">Yanqi Yang</a>, <a href="https://publications.waset.org/abstracts/search?q=Chongshan%20Liao"> Chongshan Liao</a>, <a href="https://publications.waset.org/abstracts/search?q=Cheuk%20Hin%20Ho"> Cheuk 
Hin Ho</a>, <a href="https://publications.waset.org/abstracts/search?q=Susan%20Bridges"> Susan Bridges </a> </p> <p class="card-text"><strong>Abstract:</strong></p> Aim: To investigate students’ perceptions of using e-models in an inquiry-based curriculum. Approach: 52 second-year dental students completed a pre- and post-test questionnaire relating to their perceptions of e-models and their use in inquiry-based learning. The pre-test occurred prior to any learning with e-models. The follow-up survey was conducted after one year's experience of using e-models. Results: There was no significant difference between the two sets of questionnaires regarding students’ perceptions of the usefulness of e-models and their willingness to use e-models in future inquiry-based learning. Most of the students preferred using both plaster models and e-models in tandem. Conclusion: Students did not change their attitude towards e-models, and most of them agreed or were neutral that e-models are useful in inquiry-based learning. Whilst recognizing the utility of 3D models for learning, students' preference for combining these with solid models has implications for the development of haptic sensibility in an operative discipline.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=e-models" title="e-models">e-models</a>, <a href="https://publications.waset.org/abstracts/search?q=inquiry-based%20curriculum" title=" inquiry-based curriculum"> inquiry-based curriculum</a>, <a href="https://publications.waset.org/abstracts/search?q=education" title=" education"> education</a>, <a href="https://publications.waset.org/abstracts/search?q=questionnaire" title=" questionnaire"> questionnaire</a> </p> <a href="https://publications.waset.org/abstracts/3739/students-perception-of-using-dental-e-models-in-an-inquiry-based-curriculum" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/3739.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">431</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">17</span> Force Feedback Enabled Syringe for Aspiration and Biopsy</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Pelin%20Su%20Firat">Pelin Su Firat</a>, <a href="https://publications.waset.org/abstracts/search?q=Sohyung%20Cho"> Sohyung Cho</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Biopsy or aspiration procedures are known to be complicated as they involve the penetration of a needle through human tissues, including vital organs. This research presents the design of a force sensor-guided device to be used with syringes and needles for aspiration and biopsy. The development of the device was aimed to help accomplish accurate needle placement and increase the performance of the surgeon in navigating the tool and tracking the target. 
Specifically, a prototype for a force-sensor embedded syringe has been created using 3D (3-Dimensional) modeling and printing techniques, in which two different force sensors were used to provide significant force feedback to users during operations when needles penetrate different tissues. From the extensive tests using synthetic tissues, it is shown that the proposed syringe design has accomplished the desired accuracy, efficiency, repeatability, and effectiveness. Further development is desirable through usability tests. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=biopsy" title="biopsy">biopsy</a>, <a href="https://publications.waset.org/abstracts/search?q=syringe" title=" syringe"> syringe</a>, <a href="https://publications.waset.org/abstracts/search?q=force%20sensors" title=" force sensors"> force sensors</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic%20feedback" title=" haptic feedback"> haptic feedback</a> </p> <a href="https://publications.waset.org/abstracts/183278/force-feedback-enabled-syringe-for-aspiration-and-biopsy" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/183278.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">69</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">16</span> The Study of Sensory Breadth Experiences in an Online Try-On Environment</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Tseng-Lung%20Huang">Tseng-Lung Huang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Sensory breadth experiences, such as visualization, a sense of self-location, and haptic experiences, are critical in an
online try-on environment. This research adopts an emotional appeal perspective, including concrete and abstract effects, to clarify the relationship between sensory experience and consumers' behavioral intention in an online try-on context. This study employed an augmented reality interactive technology (ARIT) in an online clothes-fitting context and applied snowball sampling using e-mail to invite online consumers, first to use ARIT for trying on online apparel and then to complete a questionnaire. One hundred sixty-eight valid questionnaires were collected, and partial least squares (PLS) path modeling was used to test our hypotheses. The results showed that sensory breadth, by arousing a concrete effect, induces impulse buying intention and willingness to pay a price premium in online shopping. Parasocial presence, as an abstract effect, diminishes the effect of concrete effects on willingness to pay a price premium. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=sensory%20breadth" title="sensory breadth">sensory breadth</a>, <a href="https://publications.waset.org/abstracts/search?q=impulsive%20behavior" title=" impulsive behavior"> impulsive behavior</a>, <a href="https://publications.waset.org/abstracts/search?q=price%20premium" title=" price premium"> price premium</a>, <a href="https://publications.waset.org/abstracts/search?q=emotional%20appeal" title=" emotional appeal"> emotional appeal</a>, <a href="https://publications.waset.org/abstracts/search?q=online%20try-on%20context" title=" online try-on context"> online try-on context</a> </p> <a href="https://publications.waset.org/abstracts/25396/the-study-of-sensory-breadth-experiences-in-an-online-try-on-environment" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/25396.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads
<span class="badge badge-light">548</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">15</span> Vibrotactility: Exploring and Prototyping the Aesthetics and Technology of Vibrotactility</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Elsa%20Kosmack%20Vaara">Elsa Kosmack Vaara</a>, <a href="https://publications.waset.org/abstracts/search?q=Cheryl%20Akner%20Koler"> Cheryl Akner Koler</a>, <a href="https://publications.waset.org/abstracts/search?q=Yusuf%20Mulla"> Yusuf Mulla</a>, <a href="https://publications.waset.org/abstracts/search?q=Parivash%20Ranjbar"> Parivash Ranjbar</a>, <a href="https://publications.waset.org/abstracts/search?q=Anneli%20N%C3%B6u"> Anneli Nöu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This transdisciplinary research weaves together an aesthetic perspective with a technical one to develop human sensitivity for vibration and construct flexible, wearable devices that are miniature, lightweight, and energy efficient. By applying methods from artistic research, performative arts, audio science, nanotechnology, and interaction design, we created working prototypes with actuators that were specifically positioned in various places on the body. The vibrotactile prototypes were tested by our research team, design students, and people with deafblindness and blindness, each with different intentions. Some tests supported connoisseurship for vibrotactile musical expression. Others aimed for precise navigational instructions. Our results and discussion concern problems in establishing standards for vibrotactility because standards minimize diversity and narrow possible ways vibration can be experienced. Human bodies vary significantly in ‘where’ vibrotactile signals can be sensed and ‘how’ they awaken emotions. 
We encourage others to embrace the dynamic exchange between new haptic technology and aesthetic complexity. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=aesthetics" title="aesthetics">aesthetics</a>, <a href="https://publications.waset.org/abstracts/search?q=vibration" title=" vibration"> vibration</a>, <a href="https://publications.waset.org/abstracts/search?q=music" title=" music"> music</a>, <a href="https://publications.waset.org/abstracts/search?q=interaction%20design" title=" interaction design"> interaction design</a>, <a href="https://publications.waset.org/abstracts/search?q=deafblindness" title=" deafblindness"> deafblindness</a> </p> <a href="https://publications.waset.org/abstracts/159494/vibrotactility-exploring-and-prototyping-the-aesthetics-and-technology-of-vibrotactility" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/159494.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">86</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">14</span> Bilateral Telecontrol of AutoMerlin Mobile Robot Using Time Domain Passivity Control</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Aamir%20Shahzad">Aamir Shahzad</a>, <a href="https://publications.waset.org/abstracts/search?q=Hubert%20Roth"> Hubert Roth</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents the bilateral telecontrol of the AutoMerlin mobile robot in the presence of communication delay.
Passivity observers have been designed to monitor the net energy at both ports of a two-port network; if either or both ports become active, making the net energy negative, the passivity controllers dissipate the appropriate energy to make the overall system passive in the presence of time delay. The environment force is modeled and sent back to the human operator so that s/he can feel it and gain additional information about the environment in the vicinity of the mobile robot. Experimental results have been presented to show the performance and stability of the bilateral controller. The results show that whenever the passivity observers detect active behavior, the passivity controllers come into action to neutralize it and make the overall system passive. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=bilateral%20control" title="bilateral control">bilateral control</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20operator" title=" human operator"> human operator</a>, <a href="https://publications.waset.org/abstracts/search?q=haptic%20device" title=" haptic device"> haptic device</a>, <a href="https://publications.waset.org/abstracts/search?q=communication%20network" title=" communication network"> communication network</a>, <a href="https://publications.waset.org/abstracts/search?q=time%20domain%20passivity%20control" title=" time domain passivity control"> time domain passivity control</a>, <a href="https://publications.waset.org/abstracts/search?q=passivity%20observer" title=" passivity observer"> passivity observer</a>, <a href="https://publications.waset.org/abstracts/search?q=passivity%20controller" title=" passivity controller"> passivity controller</a>, <a href="https://publications.waset.org/abstracts/search?q=time%20delay" title=" time delay"> time delay</a>, <a href="https://publications.waset.org/abstracts/search?q=mobile%20robot" title=" mobile robot"> mobile robot</a>,
<a href="https://publications.waset.org/abstracts/search?q=environment%20force" title=" environment force"> environment force</a> </p> <a href="https://publications.waset.org/abstracts/39534/bilateral-telecontrol-of-automerlin-mobile-robot-using-time-domain-passivity-control" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/39534.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">392</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13</span> Multi-Tooled Robotic Hand for Tele-Operation of Explosive Devices</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Faik%20Derya%20Ince">Faik Derya Ince</a>, <a href="https://publications.waset.org/abstracts/search?q=Ugur%20Topgul"> Ugur Topgul</a>, <a href="https://publications.waset.org/abstracts/search?q=Alp%20%20Gunay"> Alp Gunay</a>, <a href="https://publications.waset.org/abstracts/search?q=Can%20Bayoglu"> Can Bayoglu</a>, <a href="https://publications.waset.org/abstracts/search?q=Dante%20J.%20Dorantes-Gonzalez"> Dante J. Dorantes-Gonzalez</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Explosive attacks are arguably the most lethal threat that may occur in terrorist attacks. In order to counteract this issue, explosive ordnance disposal operators put their lives on the line to dispose of a possible improvised explosive device. Robots can make the disposal process more accurately and saving human lives. For this purpose, there is a demand for more accurate and dexterous manipulating robotic hands that can be teleoperated from a distance. 
The aim of this project is to design a robotic hand that contains two active and two passive DOF for each finger, as well as a minimum set of tools for mechanical cutting and screw driving within the same robotic hand. Both the hand and the toolset are teleoperated from a distance via a haptic robotic glove in order to manipulate dangerous objects such as improvised explosive devices. SolidWorks® Computer-Aided Design, computerized dynamic simulation, and MATLAB® kinematic and static analysis were used for the robotic hand and toolset design. Novel, dexterous and robust solutions for the fingers were obtained, and six servo motors are used in total to remotely control the multi-tooled robotic hand. This project is still ongoing and presents current results. Future research steps are also presented. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Explosive%20Manipulation" title="Explosive Manipulation">Explosive Manipulation</a>, <a href="https://publications.waset.org/abstracts/search?q=Robotic%20Hand" title=" Robotic Hand"> Robotic Hand</a>, <a href="https://publications.waset.org/abstracts/search?q=Tele-Operation" title=" Tele-Operation"> Tele-Operation</a>, <a href="https://publications.waset.org/abstracts/search?q=Tool%20Integration" title=" Tool Integration"> Tool Integration</a> </p> <a href="https://publications.waset.org/abstracts/123898/multi-tooled-robotic-hand-for-tele-operation-of-explosive-devices" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/123898.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">141</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">12</span> Understanding the Experience of the Visually Impaired towards a
Multi-Sensorial Architectural Design</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sarah%20M.%20Oteifa">Sarah M. Oteifa</a>, <a href="https://publications.waset.org/abstracts/search?q=Lobna%20A.%20Sherif"> Lobna A. Sherif</a>, <a href="https://publications.waset.org/abstracts/search?q=Yasser%20M.%20Mostafa"> Yasser M. Mostafa</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Visually impaired people, in their daily lives, face struggles and spatial barriers because the built environment is often designed with an extreme focus on the visual element, causing what is called architectural visual bias or ocularcentrism. The aim of the study is to holistically understand the world of the visually impaired as an attempt to extract the qualities of space that accommodate their needs, and to show the importance of multi-sensory, holistic designs for the blind. Within the framework of existential phenomenology, common themes are reached through &quot;intersubjectivity&quot;: experience descriptions by blind people and blind architects, observation of how blind children learn to perceive their surrounding environment, and a personal lived blind-folded experience are analyzed. The extracted themes show how visually impaired people filter out and prioritize tactile (active, passive and dynamic touch), acoustic and olfactory spatial qualities respectively, and how this happened during the personal lived blind folded experience. The themes clarify that haptic and aural inclusive designs are essential to create environments suitable for the visually impaired to empower them towards an independent, safe and efficient life. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=architecture" title="architecture">architecture</a>, <a href="https://publications.waset.org/abstracts/search?q=architectural%20ocularcentrism" title=" architectural ocularcentrism"> architectural ocularcentrism</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-sensory%20design" title=" multi-sensory design"> multi-sensory design</a>, <a href="https://publications.waset.org/abstracts/search?q=visually%20impaired" title=" visually impaired"> visually impaired</a> </p> <a href="https://publications.waset.org/abstracts/72324/understanding-the-experience-of-the-visually-impaired-towards-a-multi-sensorial-architectural-design" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/72324.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">202</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11</span> Revisiting Pedestrians’ Appraisals of Urban Streets</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Norhaslina%20Hassan">Norhaslina Hassan</a>, <a href="https://publications.waset.org/abstracts/search?q=Sherina%20Rezvanipour"> Sherina Rezvanipour</a>, <a href="https://publications.waset.org/abstracts/search?q=Amirhosein%20Ghaffarian%20Hoseini"> Amirhosein Ghaffarian Hoseini</a>, <a href="https://publications.waset.org/abstracts/search?q=Ng%20Siew%20Cheok"> Ng Siew Cheok</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The walkability features of urban streets are prominent factors that are often focused on achieving a pedestrian-friendly environment. 
The limited attention that walkability enhancements devote to pedestrians' experiences or perceptions, on the other hand, raises the question of whether walkability enhancement alone is sufficient for pedestrians to enjoy using the streets. Thus, this paper evaluates the relationship between the socio-physical components of urban streets and pedestrians’ perceptions. A total of 1152 pedestrians from five urban streets in two major Malaysian cities, Kuala Lumpur and George Town, Penang, participated in this study. In particular, this study used pedestrian preference scores towards socio-physical attributes that exist in urban streets to assess their impact on pedestrians’ appraisals of street likeability, comfort, and safety. Through analysis, principal component analysis extracted eight socio-physical components, which were then tested via an ordinal regression model to identify their impact on pedestrian street likeability, comfort (visual, auditory, haptic and olfactory), and safety (physical safety, environmental safety, and security). Furthermore, a non-parametric Kruskal-Wallis test was used to identify whether the results were subject to any socio-demographic differences. The results found that all eight components had some degree of effect on the appraisals. It was also revealed that pedestrians’ preferences towards the attributes, as well as their appraisals, varied significantly based on their age, gender, ethnicity and education. These results and their implications for urban planning are further discussed in this paper.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=pedestrian%20appraisal" title="pedestrian appraisal">pedestrian appraisal</a>, <a href="https://publications.waset.org/abstracts/search?q=pedestrian%20perception" title=" pedestrian perception"> pedestrian perception</a>, <a href="https://publications.waset.org/abstracts/search?q=street%20sociophysical%20attributes" title=" street sociophysical attributes"> street sociophysical attributes</a>, <a href="https://publications.waset.org/abstracts/search?q=walking%20experience" title=" walking experience"> walking experience</a> </p> <a href="https://publications.waset.org/abstracts/155814/revisiting-pedestrians-appraisals-of-urban-streets" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/155814.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">124</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">10</span> Application of Industrial Ergonomics in Vehicle Service System Design</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Zhao%20Yu">Zhao Yu</a>, <a href="https://publications.waset.org/abstracts/search?q=Zhi-Nan%20Zhang"> Zhi-Nan Zhang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> More and more interactive devices are used in the transportation service system. Our mobile phones, on-board computers, and Head-Up Displays (HUDs) can all be used as the tools of the in-car service system. People can access smart systems with different terminals such as mobile phones, computers, pads and even their cars and watches. 
Different forms of terminals bring different qualities of interaction through their various human-computer interaction modes. The new interactive devices require good ergonomic design at each stage of the whole design process. According to the theory of human factors and ergonomics, this paper compared three types of interactive devices across four driving tasks. Forty-eight drivers were chosen to experience these three interactive devices (mobile phones, on-board computers, and HUDs) in a simulated driving process. The subjects evaluated ergonomics performance and subjective workload after the process, and they were encouraged to offer suggestions for improving the interactive devices. The results show that different interactive devices have different advantages in driving tasks, especially in non-driving tasks such as information and entertainment. Compared with the mobile phone and on-board computer groups, the HUD group had shorter response times in most tasks. The slow-up and emergency braking tasks were performed less accurately than by the control group, which may be because the haptic feedback in these two tasks is harder to distinguish than the visual information. Simulated driving is also helpful in improving the design of in-vehicle interactive devices. The paper summarizes the ergonomics characteristics of three in-vehicle interactive devices, and the research provides a reference for the future design of in-vehicle interactive devices through an ergonomic approach to ensure a good interaction relationship between the driver and the in-vehicle service system.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human%20factors" title="human factors">human factors</a>, <a href="https://publications.waset.org/abstracts/search?q=industrial%20ergonomics" title=" industrial ergonomics"> industrial ergonomics</a>, <a href="https://publications.waset.org/abstracts/search?q=transportation%20system" title=" transportation system"> transportation system</a>, <a href="https://publications.waset.org/abstracts/search?q=usability" title=" usability"> usability</a>, <a href="https://publications.waset.org/abstracts/search?q=vehicle%20user%20interface" title=" vehicle user interface"> vehicle user interface</a> </p> <a href="https://publications.waset.org/abstracts/111147/application-of-industrial-ergonomics-in-vehicle-service-system-design" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/111147.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">139</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">9</span> Wearable Jacket for Game-Based Post-Stroke Arm Rehabilitation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=A.%20Raj%20Kumar">A. Raj Kumar</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Okunseinde"> A. Okunseinde</a>, <a href="https://publications.waset.org/abstracts/search?q=P.%20Raghavan"> P. Raghavan</a>, <a href="https://publications.waset.org/abstracts/search?q=V.%20Kapila"> V. Kapila</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Stroke is the leading cause of adult disability worldwide. 
With recent advances in immediate post-stroke care, there is an increasing number of young stroke survivors under the age of 65. While most stroke survivors will regain the ability to walk, they often experience long-term arm and hand motor impairments. Long-term upper-limb rehabilitation is needed to restore movement and function and to prevent deterioration from complications such as learned non-use and learned bad-use. We have developed a novel virtual coach, a wearable instrumented rehabilitation jacket, to motivate individuals to participate in long-term skill re-learning that can be personalized to their impairment profile. The jacket estimates the movements of an individual’s arms using embedded off-the-shelf sensors (e.g., a 9-DOF IMU for inertial measurements and flex sensors for measuring the angular orientation of the fingers) and a Bluetooth Low Energy (BLE) microcontroller (e.g., RFduino) to extract data non-intrusively. Each 9-DOF IMU contains a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetometer used to compute quaternions, which are transmitted to a computer that computes the Euler angles and estimates the angular orientation of the arms. The data are used in a gaming environment to provide visual and/or haptic feedback for goal-based, augmented-reality training that facilitates re-learning in a cost-effective, evidence-based manner. The full paper will elaborate on the technical aspects of the communication, the interactive gaming environment, and the physical aspects of the electronics necessary to achieve our stated goal. Moreover, the paper will suggest methods for using the proposed system as a cheaper, more portable, and more versatile alternative to existing instrumentation for personalized post-stroke arm rehabilitation. 
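The abstract does not give the quaternion-to-Euler-angle step itself; as an illustrative sketch only (not the authors' implementation), the conversion the host computer would typically perform on each received unit quaternion (w, x, y, z), using the common roll-pitch-yaw (ZYX) convention, looks like this:

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion (w, x, y, z) to Euler angles
    (roll, pitch, yaw) in radians, ZYX convention."""
    # roll: rotation about the x-axis
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # pitch: rotation about the y-axis; clamp to avoid math-domain
    # errors from floating-point noise near gimbal lock
    s = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(s)
    # yaw: rotation about the z-axis
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw
```

A quaternion of (√0.5, 0, 0, √0.5), for example, corresponds to a pure 90° yaw rotation. In practice the IMU's sensor-fusion firmware supplies the quaternion; only this decoding runs on the computer side.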
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=feedback" title="feedback">feedback</a>, <a href="https://publications.waset.org/abstracts/search?q=gaming" title=" gaming"> gaming</a>, <a href="https://publications.waset.org/abstracts/search?q=Euler%20angles" title=" Euler angles"> Euler angles</a>, <a href="https://publications.waset.org/abstracts/search?q=rehabilitation" title=" rehabilitation"> rehabilitation</a>, <a href="https://publications.waset.org/abstracts/search?q=augmented%20reality" title=" augmented reality"> augmented reality</a> </p> <a href="https://publications.waset.org/abstracts/47061/wearable-jacket-for-game-based-post-stroke-arm-rehabilitation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/47061.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">277</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">8</span> Autistic Traits and Multisensory Integration–Using a Size-Weight Illusion Paradigm</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Man%20Wai%20Lei">Man Wai Lei</a>, <a href="https://publications.waset.org/abstracts/search?q=Charles%20Mark%20Zaroff"> Charles Mark Zaroff</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Objective: A majority of studies suggest that people with Autism Spectrum Disorder (ASD) have multisensory integration deficits. However, normal and even supranormal multisensory integration abilities have also been reported. Additionally, little of this work has been undertaken utilizing a dimensional conceptualization of ASD; i.e., a broader autism phenotype. 
Utilizing methodology that controls for common potential confounds, the current study aimed to examine whether deficits in multisensory integration are associated with ASD traits in a non-clinical population. The contribution of affective versus non-affective components of sensory hypersensitivity to multisensory integration was also examined. Methods: Participants were 147 undergraduate university students in Macau, a Special Administrative Region of China, of Chinese ethnicity, aged 16 to 21 (mean age = 19.13; SD = 1.07). Participants completed the Autism-Spectrum Quotient, the Sensory Perception Quotient, and the Adolescent/Adult Sensory Profile, in order to measure ASD traits and the non-affective and affective aspects of sensory/perceptual hypersensitivity, respectively. To explore multisensory integration across the visual and haptic domains, participants were asked to judge which of two equally weighted but different-sized cylinders was heavier, as a means of detecting the presence of the size-weight illusion (SWI). Results: ASD trait level was significantly and negatively correlated with susceptibility to the SWI (p < 0.05); this correlation was not associated with either accuracy in weight discrimination or gender. Examining the top decile of the non-normally distributed SWI scores revealed a significant negative association with sensation avoiding, but not with other aspects of affective or non-affective sensory hypersensitivity. Conclusion and Implications: Within the normal population, a greater degree of ASD traits is associated with a lower likelihood of multisensory integration, echoing what is often found in individuals with a clinical diagnosis of ASD and providing further evidence for the dimensional nature of this disorder. This tendency appears to be associated with dysphoric emotional reactions to sensory input. 
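Because the SWI scores were non-normally distributed, a rank-based correlation is the appropriate statistic for the reported trait-susceptibility association. As a dependency-free sketch (the abstract does not state the authors' analysis software, and any example data are made up), Spearman's rho can be computed as the Pearson correlation of tie-averaged ranks:

```python
def _ranks(values):
    """Ranks (1-based), with tied values assigned their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over the run of values tied with values[order[i]]
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # average of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(xs, ys):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(xs), _ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

With perfectly monotone-decreasing pairs such as trait scores [1, 2, 3, 4] against SWI scores [8, 6, 4, 2], `spearman_rho` returns -1.0; the study's negative rho would lie between -1 and 0.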
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Autism%20Spectrum%20Disorder" title="Autism Spectrum Disorder">Autism Spectrum Disorder</a>, <a href="https://publications.waset.org/abstracts/search?q=dimensional" title=" dimensional"> dimensional</a>, <a href="https://publications.waset.org/abstracts/search?q=multisensory%20integration" title=" multisensory integration"> multisensory integration</a>, <a href="https://publications.waset.org/abstracts/search?q=size-weight%20illusion" title=" size-weight illusion"> size-weight illusion</a> </p> <a href="https://publications.waset.org/abstracts/34641/autistic-traits-and-multisensory-integration-using-a-size-weight-illusion-paradigm" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/34641.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">482</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">&lsaquo;</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=haptic&amp;page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=haptic&amp;page=2" rel="next">&rsaquo;</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 
16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> 
</ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>
