
Search results for: leap motion

href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="leap motion"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 1361</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: leap motion</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1361</span> Hand Motion and Gesture Control of Laboratory Test Equipment Using the Leap Motion Controller</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ian%20A.%20Grout">Ian A. Grout</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, the design and development of a system to provide hand motion and gesture control of laboratory test equipment is considered and discussed. The Leap Motion controller is used to provide an input to control a laboratory power supply as part of an electronic circuit experiment. By suitable hand motions and gestures, control of the power supply is provided remotely and without the need to physically touch the equipment used. As such, it provides an alternative manner in which to control electronic equipment via a PC and is considered here within the field of human computer interaction (HCI). 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=control" title="control">control</a>, <a href="https://publications.waset.org/abstracts/search?q=hand%20gesture" title=" hand gesture"> hand gesture</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20computer%20interaction" title=" human computer interaction"> human computer interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=test%20equipment" title=" test equipment"> test equipment</a> </p> <a href="https://publications.waset.org/abstracts/72099/hand-motion-and-gesture-control-of-laboratory-test-equipment-using-the-leap-motion-controller" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/72099.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">315</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1360</span> CONDUCTHOME: Gesture Interface Control of Home Automation Boxes</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=J.%20Branstett">J. Branstett</a>, <a href="https://publications.waset.org/abstracts/search?q=V.%20Gagneux"> V. Gagneux</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Leleu"> A. Leleu</a>, <a href="https://publications.waset.org/abstracts/search?q=B.%20Levadoux"> B. Levadoux</a>, <a href="https://publications.waset.org/abstracts/search?q=J.%20Pascale"> J. Pascale</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents the interface CONDUCTHOME which controls home automation systems with a Leap Motion using ‘invariant gesture protocols’. The function of this interface is to simplify the interaction of the user with its environment. A hardware part allows the Leap Motion to be carried around the house. A software part interacts with the home automation box and displays the useful information for the user. An objective of this work is the development a natural/invariant/simple gesture control interface to help elder people/people with disabilities. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=automation" title="automation">automation</a>, <a href="https://publications.waset.org/abstracts/search?q=ergonomics" title=" ergonomics"> ergonomics</a>, <a href="https://publications.waset.org/abstracts/search?q=gesture%20recognition" title=" gesture recognition"> gesture recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=interoperability" title=" interoperability"> interoperability</a> </p> <a href="https://publications.waset.org/abstracts/38302/conducthome-gesture-interface-control-of-home-automation-boxes" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/38302.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">431</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1359</span> Haptic Cycle: Designing Enhanced Museum Learning Activities</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Menelaos%20N.%20Katsantonis">Menelaos N. Katsantonis</a>, <a href="https://publications.waset.org/abstracts/search?q=Athanasios%20Manikas"> Athanasios Manikas</a>, <a href="https://publications.waset.org/abstracts/search?q=Alexandros%20Chatzis"> Alexandros Chatzis</a>, <a href="https://publications.waset.org/abstracts/search?q=Stavros%20Doropoulos"> Stavros Doropoulos</a>, <a href="https://publications.waset.org/abstracts/search?q=Anastasios%20Avramis"> Anastasios Avramis</a>, <a href="https://publications.waset.org/abstracts/search?q=Ioannis%20Mavridis"> Ioannis Mavridis</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Museums enhance their potential by adopting new technologies and techniques to appeal to more visitors and engage them in creative and joyful activities. In this study, the Haptic Cycle is presented, a cycle of museum activities proposed for the development of museum learning approaches with optimized effectiveness and engagement. Haptic Cycle envisages the improvement of the museum’s services by offering a wide range of activities. Haptic Cycle activities make the museum’s exhibitions more approachable by bringing them closer to the visitors. Visitors can interact with the museum’s artifacts and explore them haptically and sonically. Haptic Cycle proposes constructivist learning activities in which visitors actively construct their knowledge by exploring the artifacts, experimenting with them and realizing their importance. Based on the Haptic Cycle, we developed the HapticSOUND system, an innovative virtual reality system that includes an advanced user interface that employs gesture-based technology. HapticSOUND’s interface utilizes the leap motion gesture recognition controller and a 3D-printed traditional Cretan lute, utilized by visitors to perform various activities such as exploring the lute and playing notes and songs. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=haptic%20cycle" title="haptic cycle">haptic cycle</a>, <a href="https://publications.waset.org/abstracts/search?q=HapticSOUND" title=" HapticSOUND"> HapticSOUND</a>, <a href="https://publications.waset.org/abstracts/search?q=museum%20learning" title=" museum learning"> museum learning</a>, <a href="https://publications.waset.org/abstracts/search?q=gesture-based" title=" gesture-based"> gesture-based</a>, <a href="https://publications.waset.org/abstracts/search?q=leap%20motion" title=" leap motion"> leap motion</a> </p> <a href="https://publications.waset.org/abstracts/165300/haptic-cycle-designing-enhanced-museum-learning-activities" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/165300.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">91</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1358</span> Analysis of Electricity Demand at Household Level Using Leap Model in Balochistan, Pakistan</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sheikh%20Saeed%20Ahmad">Sheikh Saeed Ahmad</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Electricity is vital for any state’s development that needs policy for planning the power network extension. This study is about simulation modeling for electricity in Balochistan province. Baseline data of electricity consumption was used of year 2004 and projected with the help of LEAP model up to subsequent 30 years. Three scenarios were created to run software. One scenario was baseline and other two were alternative or green scenarios i.e. solar and wind energy scenarios. Present study revealed that Balochistan has much greater potential for solar and wind energy for electricity production. By adopting these alternative energy forms, Balochistan can save energy in future nearly 23 and 48% by incorporating solar and wind power respectively. Thus, the study suggests to government planners, an aspect of integrating renewable sources in power system for ensuring sustainable development and growth. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=demand%20and%20supply" title="demand and supply">demand and supply</a>, <a href="https://publications.waset.org/abstracts/search?q=LEAP" title=" LEAP"> LEAP</a>, <a href="https://publications.waset.org/abstracts/search?q=solar%20energy" title=" solar energy"> solar energy</a>, <a href="https://publications.waset.org/abstracts/search?q=wind%20energy" title=" wind energy"> wind energy</a>, <a href="https://publications.waset.org/abstracts/search?q=households" title=" households"> households</a> </p> <a href="https://publications.waset.org/abstracts/18942/analysis-of-electricity-demand-at-household-level-using-leap-model-in-balochistan-pakistan" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/18942.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">427</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1357</span> ISME: Integrated Style Motion Editor for 3D Humanoid Character</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ismahafezi%20Ismail">Ismahafezi Ismail</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohd%20Shahrizal%20Sunar"> Mohd Shahrizal Sunar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The motion of a realistic 3D humanoid character is very important especially for the industries developing computer animations and games. However, this type of motion is seen with a very complex dimensional data as well as body position, orientation, and joint rotation. Integrated Style Motion Editor (ISME), on the other hand, is a method used to alter the 3D humanoid motion capture data utilised in computer animation and games development. Therefore, this study was carried out with the purpose of demonstrating a method that is able to manipulate and deform different motion styles by integrating Key Pose Deformation Technique and Trajectory Control Technique. This motion editing method allows the user to generate new motions from the original motion capture data using a simple interface control. Unlike the previous method, our method produces a realistic humanoid motion style in real time. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=computer%20animation" title="computer animation">computer animation</a>, <a href="https://publications.waset.org/abstracts/search?q=humanoid%20motion" title=" humanoid motion"> humanoid motion</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20capture" title=" motion capture"> motion capture</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20editing" title=" motion editing"> motion editing</a> </p> <a href="https://publications.waset.org/abstracts/54401/isme-integrated-style-motion-editor-for-3d-humanoid-character" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/54401.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">382</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1356</span> Simulation Approach for Analyzing Transportation Energy System in South Korea</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sungjun%20Hong">Sungjun Hong</a>, <a href="https://publications.waset.org/abstracts/search?q=Youah%20Lee"> Youah Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=Jongwook%20Kim"> Jongwook Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the last COP21 held in Paris on 2015, Korean government announced that Intended Nationally Determined Contributions (INDC) was 37% based on BAU by 2030. The GHG reduction rate of the transportation sector is the strongest among all sectors by 2020. In order to cope with Korean INDC, Korean government established that 3rd eco-friendly car deployment national plans at the end of 2015. In this study, we make the energy system model for estimating GHG emissions using LEAP model. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=INDC" title="INDC">INDC</a>, <a href="https://publications.waset.org/abstracts/search?q=greenhouse%20gas" title=" greenhouse gas"> greenhouse gas</a>, <a href="https://publications.waset.org/abstracts/search?q=LEAP" title=" LEAP"> LEAP</a>, <a href="https://publications.waset.org/abstracts/search?q=transportation" title=" transportation"> transportation</a> </p> <a href="https://publications.waset.org/abstracts/55860/simulation-approach-for-analyzing-transportation-energy-system-in-south-korea" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/55860.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">205</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1355</span> Classification of Equations of Motion</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Amritpal%20Singh%20Nafria">Amritpal Singh Nafria</a>, <a href="https://publications.waset.org/abstracts/search?q=Rohit%20Sharma"> Rohit Sharma</a>, <a href="https://publications.waset.org/abstracts/search?q=Md.%20Shami%20Ansari"> Md. 
1354. Adaptive Motion Planning for 6-DOF Robots Based on Trigonometric Functions
Authors: Jincan Li, Mingyu Gao, Zhiwei He, Yuxiang Yang, Zhongfei Yu, Yuanyuan Liu
Abstract: Building an appropriate motion model is crucial for the trajectory planning of robots and directly determines the operational quality. An adaptive acceleration and deceleration motion planning method based on trigonometric functions for the end-effector of 6-DOF robots in the Cartesian coordinate system is proposed in this paper. This method not only achieves smooth translation and rotation motion by constructing a continuous jerk model, but also automatically adjusts the parameters of the trigonometric functions according to the variable inputs and the kinematic constraints. The results of computer simulation show that this method is correct and effective for achieving adaptive motion planning for linear trajectories.
Keywords: kinematic constraints, motion planning, trigonometric function, 6-DOF robots
Procedia: https://publications.waset.org/abstracts/87082/adaptive-motion-planning-for-6-dof-robots-based-on-trigonometric-functions | PDF: https://publications.waset.org/abstracts/87082.pdf | Downloads: 271

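As a minimal sketch of the general idea (one common trigonometric building block, not the authors' specific adaptive model), the code below samples an acceleration phase whose acceleration is a half-sine, so acceleration starts and ends at zero and jerk stays bounded inside the phase. The move parameters are illustrative assumptions.

    import math

    def half_sine_accel_profile(v_max, t_acc, n=50):
        """Sample one acceleration phase where a(t) = a_pk * sin(pi*t/t_acc).

        Velocity rises smoothly from 0 to v_max with zero acceleration at both
        ends of the phase (a common trigonometric S-curve building block).
        """
        a_pk = v_max * math.pi / (2.0 * t_acc)            # peak acceleration
        samples = []
        for i in range(n + 1):
            t = t_acc * i / n
            a = a_pk * math.sin(math.pi * t / t_acc)
            v = (a_pk * t_acc / math.pi) * (1.0 - math.cos(math.pi * t / t_acc))
            samples.append((t, v, a))
        return samples

    if __name__ == "__main__":
        for t, v, a in half_sine_accel_profile(v_max=0.5, t_acc=1.0, n=5):
            print(f"t={t:.2f} s  v={v:.3f} m/s  a={a:.3f} m/s^2")
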
1353. Design of a Low Cost Motion Data Acquisition Setup for Mechatronic Systems
Authors: Baris Can Yalcin
Abstract: Motion sensors are commonly used as a valuable component in mechatronic systems; however, many mechatronic designs and applications that need motion sensors cost an enormous amount of money, especially high-tech systems. Designing software for the communication protocol between the data acquisition card and the motion sensor is another issue that has to be solved. This study presents how to design a low-cost motion data acquisition setup consisting of an MPU-6050 motion sensor (3-axis gyroscope and 3-axis accelerometer) and an Arduino Mega2560 microcontroller. The design parameters are calibration of the sensor, identification of and communication between the sensor and the data acquisition card, and interpretation of the data collected by the sensor.
Keywords: design, mechatronics, motion sensor, data acquisition
Procedia: https://publications.waset.org/abstracts/10243/design-of-a-low-cost-motion-data-acquisition-setup-for-mechatronic-systems | PDF: https://publications.waset.org/abstracts/10243.pdf | Downloads: 588

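A hedged illustration of the host-side interpretation step (not the authors' code): assuming the Arduino streams one comma-separated line of the six raw 16-bit readings per sample, the sketch below converts them to physical units using the MPU-6050 default full-scale ranges (±2 g → 16384 LSB/g, ±250 °/s → 131 LSB/(°/s)). The CSV framing is an assumption made for illustration.

    # Illustrative host-side parser for raw MPU-6050 samples streamed by an Arduino
    # as "ax,ay,az,gx,gy,gz" lines. The CSV framing is an assumed convention; the
    # scale factors are the sensor's default full-scale settings (+/-2 g, +/-250 deg/s).

    ACCEL_LSB_PER_G = 16384.0    # +/-2 g range
    GYRO_LSB_PER_DPS = 131.0     # +/-250 deg/s range

    def parse_sample(line):
        """Convert one raw CSV line into accel [g] and gyro [deg/s] tuples."""
        ax, ay, az, gx, gy, gz = (int(v) for v in line.strip().split(","))
        accel_g = (ax / ACCEL_LSB_PER_G, ay / ACCEL_LSB_PER_G, az / ACCEL_LSB_PER_G)
        gyro_dps = (gx / GYRO_LSB_PER_DPS, gy / GYRO_LSB_PER_DPS, gz / GYRO_LSB_PER_DPS)
        return accel_g, gyro_dps

    if __name__ == "__main__":
        # Example raw line: roughly 1 g on the z axis, small gyro offsets.
        accel, gyro = parse_sample("12,-34,16380,5,-3,2")
        print("accel [g]:", accel, "gyro [deg/s]:", gyro)
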
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=design" title="design">design</a>, <a href="https://publications.waset.org/abstracts/search?q=mechatronics" title=" mechatronics"> mechatronics</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20sensor" title=" motion sensor"> motion sensor</a>, <a href="https://publications.waset.org/abstracts/search?q=data%20acquisition" title=" data acquisition"> data acquisition</a> </p> <a href="https://publications.waset.org/abstracts/10243/design-of-a-low-cost-motion-data-acquisition-setup-for-mechatronic-systems" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/10243.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">588</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1352</span> Motion Effects of Arabic Typography on Screen-Based Media</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ibrahim%20Hassan">Ibrahim Hassan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Motion typography is one of the most important types of visual communication based on display. Through the digital display media, we can control the text properties (size, direction, thickness, color, etc.). The use of motion typography in visual communication made it have several images. We need to adjust the terminology and clarify the different differences between them, so relying on the word motion typography -considered a general term- is not enough to separate the different communicative functions of the moving text. In this paper, we discuss the different effects of motion typography on Arabic writing and how we can achieve harmony between the movement and the letterform, and we will, during our experiments, present a new type of text movement. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Arabic%20typography" title="Arabic typography">Arabic typography</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20typography" title=" motion typography"> motion typography</a>, <a href="https://publications.waset.org/abstracts/search?q=kinetic%20typography" title=" kinetic typography"> kinetic typography</a>, <a href="https://publications.waset.org/abstracts/search?q=fluid%20typography" title=" fluid typography"> fluid typography</a>, <a href="https://publications.waset.org/abstracts/search?q=temporal%20typography" title=" temporal typography"> temporal typography</a> </p> <a href="https://publications.waset.org/abstracts/142182/motion-effects-of-arabic-typography-on-screen-based-media" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/142182.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">160</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1351</span> Analysis on Greenhouse Gas Emissions Potential by Deploying the Green Cars in Korean Road Transport Sector</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sungjun%20Hong">Sungjun Hong</a>, <a href="https://publications.waset.org/abstracts/search?q=Yanghon%20Chung"> Yanghon Chung</a>, <a href="https://publications.waset.org/abstracts/search?q=Nyunbae%20Park"> Nyunbae Park</a>, <a href="https://publications.waset.org/abstracts/search?q=Sangyong%20Park"> Sangyong Park</a> </p> <p class="card-text"><strong>Abstract:</strong></p> South Korea, as the 7th largest greenhouse gas emitting country in 2011, announced that the national reduction target of greenhouse gas emissions was 30% based on BAU (Business As Usual) by 2020. And the reduction rate of the transport sector is 34.3% which is the highest figure among all sectors. This paper attempts to analyze the environmental effect on deploying the green cars in Korean road transport sector. In order to calculate the greenhouse gas emissions, the LEAP model is applied in this study. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=green%20car" title="green car">green car</a>, <a href="https://publications.waset.org/abstracts/search?q=greenhouse%20gas" title=" greenhouse gas"> greenhouse gas</a>, <a href="https://publications.waset.org/abstracts/search?q=LEAP%20model" title=" LEAP model"> LEAP model</a>, <a href="https://publications.waset.org/abstracts/search?q=road%20transport%20sector" title=" road transport sector"> road transport sector</a> </p> <a href="https://publications.waset.org/abstracts/18570/analysis-on-greenhouse-gas-emissions-potential-by-deploying-the-green-cars-in-korean-road-transport-sector" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/18570.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">615</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1350</span> A New Center of Motion in Cabling Robots</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Alireza%20Abbasi%20Moshaii">Alireza Abbasi Moshaii</a>, <a href="https://publications.waset.org/abstracts/search?q=Farshid%20Najafi"> Farshid Najafi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper a new model for centre of motion creating is proposed. This new method uses cables. So, it is very useful in robots because it is light and has easy assembling process. In the robots which need to be in touch with some things this method is very good. It will be described in the following. The accuracy of the idea is proved by an experiment. This system could be used in the robots which need a fixed point in the contact with some things and make a circular motion. Such as dancer, physician or repair robots. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=centre%20of%20motion" title="centre of motion">centre of motion</a>, <a href="https://publications.waset.org/abstracts/search?q=robotic%20cables" title=" robotic cables"> robotic cables</a>, <a href="https://publications.waset.org/abstracts/search?q=permanent%20touching" title=" permanent touching"> permanent touching</a>, <a href="https://publications.waset.org/abstracts/search?q=mechatronics%20engineering" title=" mechatronics engineering"> mechatronics engineering</a> </p> <a href="https://publications.waset.org/abstracts/24087/a-new-center-of-motion-in-cabling-robots" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/24087.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">443</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1349</span> Comparing the Motion of Solar System with Water Droplet Motion to Predict the Future of Solar System</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Areena%20Bhatti">Areena Bhatti</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The geometric arrangement of planet and moon is the result of a self-organizing system. In our solar system, the planets and moons are constantly orbiting around the sun. The aim of this theory is to compare the motion of a solar system with the motion of water droplet when poured into a water body. The basic methodology is to compare both motions to know how they are related to each other. The difference between both systems will be that one is extremely fast, and the other is extremely slow. The role of this theory is that by looking at the fast system we can conclude how slow the system will get to an end. Just like ripples are formed around water droplet that move away from the droplet and water droplet forming those ripples become small in size will tell us how solar system will behave in the same way. So it is concluded that large and small systems can work under the same process but with different motions of time, and motion of the solar system is the slowest form of water droplet motion. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=motion" title="motion">motion</a>, <a href="https://publications.waset.org/abstracts/search?q=water" title=" water"> water</a>, <a href="https://publications.waset.org/abstracts/search?q=sun" title=" sun"> sun</a>, <a href="https://publications.waset.org/abstracts/search?q=time" title=" time"> time</a> </p> <a href="https://publications.waset.org/abstracts/111769/comparing-the-motion-of-solar-system-with-water-droplet-motion-to-predict-the-future-of-solar-system" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/111769.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">151</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1348</span> Mapping of Electrical Energy Consumption Yogyakarta Province in 2014-2025</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Alfi%20Al%20Fahreizy">Alfi Al Fahreizy </a> </p> <p class="card-text"><strong>Abstract:</strong></p> Yogyakarta is one of the provinces in Indonesia that often get a power outage because of high load electrical consumption. The authors mapped the electrical energy consumption [GWh] for the province of Yogyakarta in 2014-2025 using LEAP (Long-range Energy Alternatives Planning system) software. This paper use BAU (Business As Usual) scenario. BAU scenario in which the projection is based on the assumption that growth in electricity consumption will run as normally as before. The goal is to be able to see the electrical energy consumption in the household sector, industry , business, social, government office building, and street lighting. The data is the data projected statistical population and consumption data electricity [GWh] 2010, 2011, 2012 in Yogyakarta province. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=LEAP" title="LEAP">LEAP</a>, <a href="https://publications.waset.org/abstracts/search?q=energy%20consumption" title=" energy consumption"> energy consumption</a>, <a href="https://publications.waset.org/abstracts/search?q=Yogyakarta" title=" Yogyakarta"> Yogyakarta</a>, <a href="https://publications.waset.org/abstracts/search?q=BAU" title=" BAU"> BAU</a> </p> <a href="https://publications.waset.org/abstracts/20956/mapping-of-electrical-energy-consumption-yogyakarta-province-in-2014-2025" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/20956.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">598</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1347</span> Relevant LMA Features for Human Motion Recognition</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Insaf%20Ajili">Insaf Ajili</a>, <a href="https://publications.waset.org/abstracts/search?q=Malik%20Mallem"> Malik Mallem</a>, <a href="https://publications.waset.org/abstracts/search?q=Jean-Yves%20Didier"> Jean-Yves Didier</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Motion recognition from videos is actually a very complex task due to the high variability of motions. This paper describes the challenges of human motion recognition, especially motion representation step with relevant features. Our descriptor vector is inspired from Laban Movement Analysis method. We propose discriminative features using the Random Forest algorithm in order to remove redundant features and make learning algorithms operate faster and more effectively. We validate our method on MSRC-12 and UTKinect datasets. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=discriminative%20LMA%20features" title="discriminative LMA features">discriminative LMA features</a>, <a href="https://publications.waset.org/abstracts/search?q=features%20reduction" title=" features reduction"> features reduction</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20motion%20recognition" title=" human motion recognition"> human motion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=random%20forest" title=" random forest"> random forest</a> </p> <a href="https://publications.waset.org/abstracts/96299/relevant-lma-features-for-human-motion-recognition" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/96299.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">195</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1346</span> Mixed Sub-Fractional Brownian Motion</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mounir%20Zili">Mounir Zili</a> </p> <p class="card-text"><strong>Abstract:</strong></p> We will introduce a new extension of the Brownian motion, that could serve to get a good model of many natural phenomena. It is a linear combination of a finite number of sub-fractional Brownian motions; that is why we will call it the mixed sub-fractional Brownian motion. We will present some basic properties of this process. Among others, we will check that our process is non-Markovian and that it has non-stationary increments. We will also give the conditions under which it is a semimartingale. Finally, the main features of its sample paths will be specified. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=mixed%20Gaussian%20processes" title="mixed Gaussian processes">mixed Gaussian processes</a>, <a href="https://publications.waset.org/abstracts/search?q=Sub-fractional%20Brownian%20motion" title=" Sub-fractional Brownian motion"> Sub-fractional Brownian motion</a>, <a href="https://publications.waset.org/abstracts/search?q=sample%20paths" title=" sample paths"> sample paths</a> </p> <a href="https://publications.waset.org/abstracts/32479/mixed-sub-fractional-brownian-motion" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/32479.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">488</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1345</span> Motion Planning of SCARA Robots for Trajectory Tracking</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Giovanni%20Incerti">Giovanni Incerti</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The paper presents a method for a simple and immediate motion planning of a SCARA robot, whose end-effector has to move along a given trajectory; the calculation procedure requires the user to define in analytical form or by points the trajectory to be followed and to assign the curvilinear abscissa as function of the time. On the basis of the geometrical characteristics of the robot, a specifically developed program determines the motion laws of the actuators that enable the robot to generate the required movement; this software can be used in all industrial applications for which a SCARA robot has to be frequently reprogrammed, in order to generate various types of trajectories with different motion times. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=motion%20planning" title="motion planning">motion planning</a>, <a href="https://publications.waset.org/abstracts/search?q=SCARA%20robot" title=" SCARA robot"> SCARA robot</a>, <a href="https://publications.waset.org/abstracts/search?q=trajectory%20tracking" title=" trajectory tracking"> trajectory tracking</a>, <a href="https://publications.waset.org/abstracts/search?q=analytical%20form" title=" analytical form"> analytical form</a> </p> <a href="https://publications.waset.org/abstracts/19726/motion-planning-of-scara-robots-for-trajectory-tracking" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/19726.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">318</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1344</span> Mixed-Sub Fractional Brownian Motion</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mounir%20Zili">Mounir Zili</a> </p> <p class="card-text"><strong>Abstract:</strong></p> We will introduce a new extension of the Brownian motion, that could serve to get a good model of many natural phenomena. 
It is a linear combination of a finite number of sub-fractional Brownian motions; that is why we will call it the mixed sub-fractional Brownian motion. We will present some basic properties of this process. Among others, we will check that our process is non-Markovian and that it has non-stationary increments. We will also give the conditions under which it is a semi-martingale. Finally, the main features of its sample paths will be specified.
Keywords: fractal dimensions, mixed gaussian processes, sample paths, sub-fractional brownian motion
Procedia: https://publications.waset.org/abstracts/36677/mixed-sub-fractional-brownian-motion | PDF: https://publications.waset.org/abstracts/36677.pdf | Downloads: 420

1343. Detection of Image Blur and Its Restoration for Image Enhancement
Authors: M. V. Chidananda Murthy, M. Z. Kurian, H. S. Guruprasad
Abstract: Image restoration in the process of communication is one of the emerging fields in image processing. Motion analysis is the simplest case of detecting motion in an image. Applications of motion analysis are widespread in many areas, such as surveillance, remote sensing, the film industry, navigation of autonomous vehicles, etc. The scene may contain multiple moving objects; by using motion analysis techniques, the blur caused by the movement of the objects can be corrected by filling in occluded regions and reconstructing transparent objects, and motion blurring can also be removed. This paper presents the design and comparison of various motion detection and enhancement filters. The median filter, linear image deconvolution, inverse filter, pseudo-inverse filter, Wiener filter, Lucy-Richardson filter and blind deconvolution filters are used to remove the blur. In this work, we have considered different types and different amounts of blur for the analysis. Mean Square Error (MSE) and Peak Signal to Noise Ratio (PSNR) are used to evaluate the performance of the filters. The designed system has been implemented in Matlab and tested on synthetic and real-time images.
Keywords: image enhancement, motion analysis, motion detection, motion estimation
Procedia: https://publications.waset.org/abstracts/59485/detection-of-image-blur-and-its-restoration-for-image-enhancement | PDF: https://publications.waset.org/abstracts/59485.pdf | Downloads: 288

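As one concrete example of the filters compared above (a textbook frequency-domain Wiener deconvolution with a scalar noise-to-signal constant, not the authors' Matlab implementation; the kernel and K value are illustrative assumptions), the sketch below restores an image blurred by a known kernel and reports the MSE metric:

    # Textbook Wiener deconvolution sketch (known blur kernel, scalar NSR constant K).
    # Not the paper's Matlab code; K and the kernel here are illustrative assumptions.
    import numpy as np

    def wiener_deconvolve(blurred, kernel, K=0.01):
        """Restore `blurred` given the blur `kernel`, using H* / (|H|^2 + K)."""
        H = np.fft.fft2(kernel, s=blurred.shape)
        G = np.fft.fft2(blurred)
        F_hat = np.conj(H) / (np.abs(H) ** 2 + K) * G
        return np.real(np.fft.ifft2(F_hat))

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        sharp = rng.random((64, 64))
        kernel = np.ones((1, 9)) / 9.0          # horizontal motion-blur kernel
        blurred = np.real(np.fft.ifft2(np.fft.fft2(sharp) * np.fft.fft2(kernel, s=sharp.shape)))
        restored = wiener_deconvolve(blurred, kernel)
        mse = np.mean((sharp - restored) ** 2)  # evaluation metric used in the paper
        print(f"MSE after restoration: {mse:.4f}")
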
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=image%20enhancement" title="image enhancement">image enhancement</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20analysis" title=" motion analysis"> motion analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20detection" title=" motion detection"> motion detection</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20estimation" title=" motion estimation"> motion estimation</a> </p> <a href="https://publications.waset.org/abstracts/59485/detection-of-image-blur-and-its-restoration-for-image-enhancement" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/59485.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">288</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1342</span> Effective Virtual Tunnel Shape for Motion Modification in Upper-Limb Perception-Assist with a Power-Assist Robot</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Kazuo%20Kiguchi">Kazuo Kiguchi</a>, <a href="https://publications.waset.org/abstracts/search?q=Kouta%20Ikegami"> Kouta Ikegami</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the case of physically weak persons, not only motor abilities, but also sensory abilities are sometimes deteriorated. The concept of perception-assist has been proposed to assist the sensory ability of the physically weak persons with a power-assist robot. Since upper-limb motion is very important in daily living, perception-assist for upper-limb motion has been proposed to assist upper-limb motion in daily living. A virtual tunnel was applied to modify the user’s upper-limb motion if it was necessary. In this paper, effective shape of the virtual tunnel which is applied in the perception-assist for upper-limb motion is proposed. Not only the position of the grasped tool but also the angle of the grasped tool are modified if it is necessary. Therefore, the upper-limb motion in daily living can be effectively modified to realize certain proper daily motion. The effectiveness of the proposed virtual tunnel was evaluated by performing the experiments. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=motion%20modification" title="motion modification">motion modification</a>, <a href="https://publications.waset.org/abstracts/search?q=power-assist%20robots" title=" power-assist robots"> power-assist robots</a>, <a href="https://publications.waset.org/abstracts/search?q=perception-assist" title=" perception-assist"> perception-assist</a>, <a href="https://publications.waset.org/abstracts/search?q=upper-limb%20motion" title=" upper-limb motion"> upper-limb motion</a> </p> <a href="https://publications.waset.org/abstracts/53101/effective-virtual-tunnel-shape-for-motion-modification-in-upper-limb-perception-assist-with-a-power-assist-robot" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/53101.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">241</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1341</span> Latency-Based Motion Detection in Spiking Neural Networks</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mohammad%20Saleh%20Vahdatpour">Mohammad Saleh Vahdatpour</a>, <a href="https://publications.waset.org/abstracts/search?q=Yanqing%20Zhang"> Yanqing Zhang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Understanding the neural mechanisms underlying motion detection in the human visual system has long been a fascinating challenge in neuroscience and artificial intelligence. This paper presents a spiking neural network model inspired by the processing of motion information in the primate visual system, particularly focusing on the Middle Temporal (MT) area. In our study, we propose a multi-layer spiking neural network model to perform motion detection tasks, leveraging the idea that synaptic delays in neuronal communication are pivotal in motion perception. Synaptic delay, determined by factors like axon length and myelin insulation, affects the temporal order of input spikes, thereby encoding motion direction and speed. Overall, our spiking neural network model demonstrates the feasibility of capturing motion detection principles observed in the primate visual system. The combination of synaptic delays, learning mechanisms, and shared weights and delays in SMD provides a promising framework for motion perception in artificial systems, with potential applications in computer vision and robotics. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=neural%20network" title="neural network">neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20detection" title=" motion detection"> motion detection</a>, <a href="https://publications.waset.org/abstracts/search?q=signature%20detection" title=" signature detection"> signature detection</a>, <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20network" title=" convolutional neural network"> convolutional neural network</a> </p> <a href="https://publications.waset.org/abstracts/174855/latency-based-motion-detection-in-spiking-neural-networks" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/174855.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">88</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1340</span> Cepstrum Analysis of Human Walking Signal</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Koichi%20Kurita">Koichi Kurita</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this study, we propose a real-time data collection technique for the detection of human walking motion from the charge generated on the human body. This technique is based on the detection of a sub-picoampere electrostatic induction current, generated by the motion, flowing through the electrode of a wireless portable sensor attached to the subject. An FFT analysis of the wave-forms of the electrostatic induction currents generated by the walking motions showed that the currents generated under normal and restricted walking conditions were different. Moreover, we carried out a cepstrum analysis to detect any differences in the walking style. Results suggest that a slight difference in motion, either due to the individual’s gait or a splinted leg, is directly reflected in the electrostatic induction current generated by the walking motion. The proposed wireless portable sensor enables the detection of even subtle differences in walking motion. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human%20walking%20motion" title="human walking motion">human walking motion</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20measurement" title=" motion measurement"> motion measurement</a>, <a href="https://publications.waset.org/abstracts/search?q=current%20measurement" title=" current measurement"> current measurement</a>, <a href="https://publications.waset.org/abstracts/search?q=electrostatic%20induction" title=" electrostatic induction"> electrostatic induction</a> </p> <a href="https://publications.waset.org/abstracts/12335/cepstrum-analysis-of-human-walking-signal" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/12335.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">344</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1339</span> Human Motion Capture: New Innovations in the Field of Computer Vision</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Najm%20Alotaibi">Najm Alotaibi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Human motion capture has become one of the major area of interest in the field of computer vision. Some of the major application areas that have been rapidly evolving include the advanced human interfaces, virtual reality and security/surveillance systems. This study provides a brief overview of the techniques and applications used for the markerless human motion capture, which deals with analyzing the human motion in the form of mathematical formulations. The major contribution of this research is that it classifies the computer vision based techniques of human motion capture based on the taxonomy, and then breaks its down into four systematically different categories of tracking, initialization, pose estimation and recognition. The detailed descriptions and the relationships descriptions are given for the techniques of tracking and pose estimation. The subcategories of each process are further described. Various hypotheses have been used by the researchers in this domain are surveyed and the evolution of these techniques have been explained. It has been concluded in the survey that most researchers have focused on using the mathematical body models for the markerless motion capture. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=human%20motion%20capture" title="human motion capture">human motion capture</a>, <a href="https://publications.waset.org/abstracts/search?q=computer%20vision" title=" computer vision"> computer vision</a>, <a href="https://publications.waset.org/abstracts/search?q=vision-based" title=" vision-based"> vision-based</a>, <a href="https://publications.waset.org/abstracts/search?q=tracking" title=" tracking"> tracking</a> </p> <a href="https://publications.waset.org/abstracts/22770/human-motion-capture-new-innovations-in-the-field-of-computer-vision" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/22770.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">320</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1338</span> Visualization-Based Feature Extraction for Classification in Real-Time Interaction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=%C3%81goston%20Nagy">Ágoston Nagy</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper introduces a method of using unsupervised machine learning to visualize the feature space of a dataset in 2D, in order to find most characteristic segments in the set. After dimension reduction, users can select clusters by manual drawing. Selected clusters are recorded into a data model that is used for later predictions, based on realtime data. Predictions are made with supervised learning, using Gesture Recognition Toolkit. The paper introduces two example applications: a semantic audio organizer for analyzing incoming sounds, and a gesture database organizer where gestural data (recorded by a Leap motion) is visualized for further manipulation. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=gesture%20recognition" title="gesture recognition">gesture recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=real-time%20interaction" title=" real-time interaction"> real-time interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=visualization" title=" visualization"> visualization</a> </p> <a href="https://publications.waset.org/abstracts/68382/visualization-based-feature-extraction-for-classification-in-real-time-interaction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/68382.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">354</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1337</span> Virtual Reality Application for Neurorehabilitation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Daniel%20Vargas-Herrera">Daniel Vargas-Herrera</a>, <a href="https://publications.waset.org/abstracts/search?q=Ivette%20Caldelas"> Ivette Caldelas</a>, <a href="https://publications.waset.org/abstracts/search?q=Fernando%20Brambila-Paz"> Fernando Brambila-Paz</a>, <a href="https://publications.waset.org/abstracts/search?q=Rodrigo%20Montufar-Chaveznava"> Rodrigo Montufar-Chaveznava</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we present a virtual reality application for neurorehabilitation. This application was developed using the Unity SDK integrating the Oculus Rift and Leap Motion devices. Essentially, it consists of three stages according to the kind of rehabilitation to carry on: ocular rehabilitation, head/neck rehabilitation, and eye-hand coordination. We build three scenes for each task; for ocular and head/neck rehabilitation, there are different objects moving in the field of view and extended field of view of the user according to some patterns relative to the therapy. In the third stage the user must try to touch with the hand some objects guided by its view. We report the primer results of the use of the application with healthy people. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=virtual%20reality" title="virtual reality">virtual reality</a>, <a href="https://publications.waset.org/abstracts/search?q=interactive%20technologies" title=" interactive technologies"> interactive technologies</a>, <a href="https://publications.waset.org/abstracts/search?q=video%20games" title=" video games"> video games</a>, <a href="https://publications.waset.org/abstracts/search?q=neurorehabilitation" title=" neurorehabilitation"> neurorehabilitation</a> </p> <a href="https://publications.waset.org/abstracts/55918/virtual-reality-application-for-neurorehabilitation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/55918.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">412</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1336</span> Motion Estimator Architecture with Optimized Number of Processing Elements for High Efficiency Video Coding</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Seongsoo%20Lee">Seongsoo Lee</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Motion estimation occupies the heaviest computation in HEVC (high efficiency video coding). Many fast algorithms such as TZS (test zone search) have been proposed to reduce the computation. Still the huge computation of the motion estimation is a critical issue in the implementation of HEVC video codec. In this paper, motion estimator architecture with optimized number of PEs (processing element) is presented by exploiting early termination. It also reduces hardware size by exploiting parallel processing. The presented motion estimator architecture has 8 PEs, and it can efficiently perform TZS with very high utilization of PEs. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=motion%20estimation" title="motion estimation">motion estimation</a>, <a href="https://publications.waset.org/abstracts/search?q=test%20zone%20search" title=" test zone search"> test zone search</a>, <a href="https://publications.waset.org/abstracts/search?q=high%20efficiency%20video%20coding" title=" high efficiency video coding"> high efficiency video coding</a>, <a href="https://publications.waset.org/abstracts/search?q=processing%20element" title=" processing element"> processing element</a>, <a href="https://publications.waset.org/abstracts/search?q=optimization" title=" optimization"> optimization</a> </p> <a href="https://publications.waset.org/abstracts/70881/motion-estimator-architecture-with-optimized-number-of-processing-elements-for-high-efficiency-video-coding" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/70881.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">363</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1335</span> Approximation of the Time Series by Fractal Brownian Motion</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Valeria%20Bondarenko">Valeria Bondarenko</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we propose two problems related to fractal Brownian motion. First problem is simultaneous estimation of two parameters, Hurst exponent and the volatility, that describe this random process. Numerical tests for the simulated fBm provided an efficient method. Second problem is approximation of the increments of the observed time series by a power function by increments from the fractional Brownian motion. Approximation and estimation are shown on the example of real data, daily deposit interest rates. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=fractional%20Brownian%20motion" title="fractional Brownian motion">fractional Brownian motion</a>, <a href="https://publications.waset.org/abstracts/search?q=Gausssian%20processes" title=" Gausssian processes"> Gausssian processes</a>, <a href="https://publications.waset.org/abstracts/search?q=approximation" title=" approximation"> approximation</a>, <a href="https://publications.waset.org/abstracts/search?q=time%20series" title=" time series"> time series</a>, <a href="https://publications.waset.org/abstracts/search?q=estimation%20of%20properties%20of%20the%20model" title=" estimation of properties of the model"> estimation of properties of the model</a> </p> <a href="https://publications.waset.org/abstracts/4285/approximation-of-the-time-series-by-fractal-brownian-motion" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/4285.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">376</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1334</span> Efficient Motion Estimation by Fast Three Step Search Algorithm</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=S.%20M.%20Kulkarni">S. M. Kulkarni</a>, <a href="https://publications.waset.org/abstracts/search?q=D.%20S.%20Bormane"> D. S. Bormane</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20L.%20Nalbalwar"> S. L. Nalbalwar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The rapid development in the technology have dramatic impact on the medical health care field. Medical data base obtained with latest machines like CT Machine, MRI scanner requires large amount of memory storage and also it requires large bandwidth for transmission of data in telemedicine applications. Thus, there is need for video compression. As the database of medical images contain number of frames (slices), hence while coding of these images there is need of motion estimation. Motion estimation finds out movement of objects in an image sequence and gets motion vectors which represents estimated motion of object in the frame. In order to reduce temporal redundancy between successive frames of video sequence, motion compensation is preformed. In this paper three step search (TSS) block matching algorithm is implemented on different types of video sequences. It is shown that three step search algorithm produces better quality performance and less computational time compared with exhaustive full search algorithm. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=block%20matching" title="block matching">block matching</a>, <a href="https://publications.waset.org/abstracts/search?q=exhaustive%20search%20motion%20estimation" title=" exhaustive search motion estimation"> exhaustive search motion estimation</a>, <a href="https://publications.waset.org/abstracts/search?q=three%20step%20search" title=" three step search"> three step search</a>, <a href="https://publications.waset.org/abstracts/search?q=video%20compression" title=" video compression"> video compression</a> </p> <a href="https://publications.waset.org/abstracts/23746/efficient-motion-estimation-by-fast-three-step-search-algorithm" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/23746.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">491</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1333</span> A Study on the Establishment of a 4-Joint Based Motion Capture System and Data Acquisition</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Kyeong-Ri%20Ko">Kyeong-Ri Ko</a>, <a href="https://publications.waset.org/abstracts/search?q=Seong%20Bong%20Bae"> Seong Bong Bae</a>, <a href="https://publications.waset.org/abstracts/search?q=Jang%20Sik%20Choi"> Jang Sik Choi</a>, <a href="https://publications.waset.org/abstracts/search?q=Sung%20Bum%20Pan"> Sung Bum Pan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A simple method for testing the posture imbalance of the human body is to check for differences in the bilateral shoulder and pelvic height of the target. In this paper, to check for spinal disorders the authors have studied ways to establish a motion capture system to obtain and express motions of 4-joints, and to acquire data based on this system. The 4 sensors are attached to the both shoulders and pelvis. To verify the established system, the normal and abnormal postures of the targets listening to a lecture were obtained using the established 4-joint based motion capture system. From the results, it was confirmed that the motions taken by the target was identical to the 3-dimensional simulation. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=inertial%20sensor" title="inertial sensor">inertial sensor</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20capture" title=" motion capture"> motion capture</a>, <a href="https://publications.waset.org/abstracts/search?q=motion%20data%20acquisition" title=" motion data acquisition"> motion data acquisition</a>, <a href="https://publications.waset.org/abstracts/search?q=posture%20imbalance" title=" posture imbalance"> posture imbalance</a> </p> <a href="https://publications.waset.org/abstracts/4802/a-study-on-the-establishment-of-a-4-joint-based-motion-capture-system-and-data-acquisition" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/4802.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">515</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1332</span> Derivation of Fractional Black-Scholes Equations Driven by Fractional G-Brownian Motion and Their Application in European Option Pricing</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Changhong%20Guo">Changhong Guo</a>, <a href="https://publications.waset.org/abstracts/search?q=Shaomei%20Fang"> Shaomei Fang</a>, <a href="https://publications.waset.org/abstracts/search?q=Yong%20He"> Yong He</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, fractional Black-Scholes models for the European option pricing were established based on the fractional G-Brownian motion (fGBm), which generalizes the concepts of the classical Brownian motion, fractional Brownian motion and the G-Brownian motion, and that can be used to be a tool for considering the long range dependence and uncertain volatility for the financial markets simultaneously. A generalized fractional Black-Scholes equation (FBSE) was derived by using the Taylor&rsquo;s series of fractional order and the theory of absence of arbitrage. Finally, some explicit option pricing formulas for the European call option and put option under the FBSE were also solved, which extended the classical option pricing formulas given by F. Black and M. Scholes. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=European%20option%20pricing" title="European option pricing">European option pricing</a>, <a href="https://publications.waset.org/abstracts/search?q=fractional%20Black-Scholes%20equations" title=" fractional Black-Scholes equations"> fractional Black-Scholes equations</a>, <a href="https://publications.waset.org/abstracts/search?q=fractional%20g-Brownian%20motion" title=" fractional g-Brownian motion"> fractional g-Brownian motion</a>, <a href="https://publications.waset.org/abstracts/search?q=Taylor%27s%20series%20of%20fractional%20order" title=" Taylor&#039;s series of fractional order"> Taylor&#039;s series of fractional order</a>, <a href="https://publications.waset.org/abstracts/search?q=uncertain%20volatility" title=" uncertain volatility"> uncertain volatility</a> </p> <a href="https://publications.waset.org/abstracts/127107/derivation-of-fractional-black-scholes-equations-driven-by-fractional-g-brownian-motion-and-their-application-in-european-option-pricing" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/127107.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">163</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">&lsaquo;</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=45">45</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=46">46</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=leap%20motion&amp;page=2" rel="next">&rsaquo;</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About 
Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", 
cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>
