CINXE.COM

Search results for: brain computer interface (BCI)

<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: brain computer interface (BCI)</title> <meta name="description" content="Search results for: brain computer interface (BCI)"> <meta name="keywords" content="brain computer interface (BCI)"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" alt="Open Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="brain computer interface (BCI)" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="brain computer interface (BCI)"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 4688</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: brain computer interface (BCI)</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4688</span> Mechanical Prosthesis Controlled by Brain-Computer Interface</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Tianyu%20Cao">Tianyu Cao</a>, <a href="https://publications.waset.org/abstracts/search?q=KIRA%20%28Ruizhi%20Zhao%29"> KIRA (Ruizhi Zhao)</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The purpose of our research is to study the possibility of people with physical disabilities manipulating mechanical prostheses through brain-computer interface (BCI) technology. The brain-machine interface (BCI) of the neural prosthesis records signals from neurons and uses mathematical modeling to decode them, converting desired movements into body movements. In order to improve the patient's neural control, the prosthesis is given a natural feeling. It records data from sensitive areas from the body to the prosthetic limb and encodes signals in the form of electrical stimulation to the brain. In our research, the brain-computer interface (BCI) is a bridge connecting patients’ cognition and the real world, allowing information to interact with each other. The efficient work between the two is achieved through external devices. The flow of information is controlled by BCI’s ability to record neuronal signals and decode signals, which are converted into device control. In this way, we could encode information and then send it to the brain through electrical stimulation, which has significant medical application. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=biomedical%20engineering" title="biomedical engineering">biomedical engineering</a>, <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface" title=" brain-computer interface"> brain-computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=prosthesis" title=" prosthesis"> prosthesis</a>, <a href="https://publications.waset.org/abstracts/search?q=neural%20control" title=" neural control"> neural control</a> </p> <a href="https://publications.waset.org/abstracts/138055/mechanical-prosthesis-controlled-by-brain-computer-interface" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/138055.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">181</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4687</span> African Personhood and the Regulation of Brain-Computer Interface (BCI) Technologies: A South African view</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Meshandren%20Naidoo">Meshandren Naidoo</a>, <a href="https://publications.waset.org/abstracts/search?q=Amy%20Gooden"> Amy Gooden</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Implantable brain-computer interface (BCI) technologies have developed to the point where brain-computer communication is possible. This has great potential in the medical field, as it allows persons who have lost capacities. However, ethicists and regulators call for a strict approach to these technologies due to the impact on personhood. This research demonstrates that the personhood debate is more nuanced and that where an African approach to personhood is used, it may produce results more favorable to the development and use of this technology. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=artificial%20intelligence" title="artificial intelligence">artificial intelligence</a>, <a href="https://publications.waset.org/abstracts/search?q=law" title=" law"> law</a>, <a href="https://publications.waset.org/abstracts/search?q=neuroscience" title=" neuroscience"> neuroscience</a>, <a href="https://publications.waset.org/abstracts/search?q=ethics" title=" ethics"> ethics</a> </p> <a href="https://publications.waset.org/abstracts/153437/african-personhood-and-the-regulation-of-brain-computer-interface-bci-technologies-a-south-african-view" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/153437.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">131</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4686</span> Meditation Based Brain Painting Promotes Foreign Language Memory through Establishing a Brain-Computer Interface</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Zhepeng%20Rui">Zhepeng Rui</a>, <a href="https://publications.waset.org/abstracts/search?q=Zhenyu%20Gu"> Zhenyu Gu</a>, <a href="https://publications.waset.org/abstracts/search?q=Caitilin%20de%20B%C3%A9rigny"> Caitilin de Bérigny</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the current study, we designed an interactive meditation and brain painting application to cultivate users’ creativity, promote meditation, reduce stress, and improve cognition while attempting to learn a foreign language. User tests and data analyses were conducted on 42 male and 42 female participants to better understand sex-associated psychological and aesthetic differences. Our method utilized brain-computer interfaces to import meditation and attention data to create artwork in meditation-based applications. Female participants showed statistically significantly different language learning outcomes following three meditation paradigms. The art style of brain painting helped females with language memory. Our results suggest that the most ideal methods for promoting memory attention were meditation methods and brain painting exercises contributing to language learning, memory concentration promotion, and foreign word memorization. We conclude that a short period of meditation practice can help in learning a foreign language. These findings provide new insights into meditation, creative language education, brain-computer interface, and human-computer interactions. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface" title="brain-computer interface">brain-computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=creative%20thinking" title=" creative thinking"> creative thinking</a>, <a href="https://publications.waset.org/abstracts/search?q=meditation" title=" meditation"> meditation</a>, <a href="https://publications.waset.org/abstracts/search?q=mental%20health" title=" mental health"> mental health</a> </p> <a href="https://publications.waset.org/abstracts/147651/meditation-based-brain-painting-promotes-foreign-language-memory-through-establishing-a-brain-computer-interface" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/147651.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">127</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4685</span> Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ram%C3%B3n%20Aparicio-Garc%C3%ADa">Ramón Aparicio-García</a>, <a href="https://publications.waset.org/abstracts/search?q=Gustavo%20Ju%C3%A1rez%20Gracia"> Gustavo Juárez Gracia</a>, <a href="https://publications.waset.org/abstracts/search?q=Jes%C3%BAs%20%C3%81lvarez%20Cedillo"> Jesús Álvarez Cedillo</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A research line of the computer science that involve the study of the Human-Computer Interaction (HCI), which search to recognize and interpret the user intent by the storage and the subsequent analysis of the electrical signals of the brain, for using them in the control of electronic devices. On the other hand, the affective computing research applies the human emotions in the HCI process helping to reduce the user frustration. This paper shows the results obtained during the hardware and software development of a Brain Computer Interface (BCI) capable of recognizing the human emotions through the association of the brain electrical activity patterns. The hardware involves the sensing stage and analogical-digital conversion. The interface software involves algorithms for pre-processing of the signal in time and frequency analysis and the classification of patterns associated with the electrical brain activity. The methods used for the analysis and classification of the signal have been tested separately, by using a database that is accessible to the public, besides to a comparison among classifiers in order to know the best performing. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=affective%20computing" title="affective computing">affective computing</a>, <a href="https://publications.waset.org/abstracts/search?q=interface" title=" interface"> interface</a>, <a href="https://publications.waset.org/abstracts/search?q=brain" title=" brain"> brain</a>, <a href="https://publications.waset.org/abstracts/search?q=intelligent%20interaction" title=" intelligent interaction"> intelligent interaction</a> </p> <a href="https://publications.waset.org/abstracts/27725/brain-computer-interface-implementation-for-affective-computing-sensing-classifiers-comparison" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/27725.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">388</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4684</span> A Robotic Rehabilitation Arm Driven by Somatosensory Brain-Computer Interface</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jiewei%20Li">Jiewei Li</a>, <a href="https://publications.waset.org/abstracts/search?q=Hongyan%20Cui"> Hongyan Cui</a>, <a href="https://publications.waset.org/abstracts/search?q=Chunqi%20Chang"> Chunqi Chang</a>, <a href="https://publications.waset.org/abstracts/search?q=Yong%20Hu"> Yong Hu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> It was expected to benefit patient with hemiparesis after stroke by extensive arm rehabilitation, to partially regain forearm and hand function. This paper propose a robotic rehabilitation arm in assisting the hemiparetic patient to learn new ways of using and moving their weak arms. In this study, the robotic arm was driven by a somatosensory stimulated brain computer interface (BCI), which is a new modality BCI. The use of somatosensory stimulation is not only an input for BCI, but also a electrical stimulation for treatment of hemiparesis to strengthen the arm and improve its range of motion. A trial of this robotic rehabilitation arm was performed in a stroke patient with pure motor hemiparesis. The initial trial showed a promising result from the patient with great motivation and function improvement. It suggests that robotic rehabilitation arm driven by somatosensory BCI can enhance the rehabilitation performance and progress for hemiparetic patients after stroke. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=robotic%20rehabilitation%20arm" title="robotic rehabilitation arm">robotic rehabilitation arm</a>, <a href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29" title=" brain computer interface (BCI)"> brain computer interface (BCI)</a>, <a href="https://publications.waset.org/abstracts/search?q=hemiparesis" title=" hemiparesis"> hemiparesis</a>, <a href="https://publications.waset.org/abstracts/search?q=stroke" title=" stroke"> stroke</a>, <a href="https://publications.waset.org/abstracts/search?q=somatosensory%20stimulation" title=" somatosensory stimulation"> somatosensory stimulation</a> </p> <a href="https://publications.waset.org/abstracts/9792/a-robotic-rehabilitation-arm-driven-by-somatosensory-brain-computer-interface" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/9792.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">390</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4683</span> Frequency Recognition Models for Steady State Visual Evoked Potential Based Brain Computer Interfaces (BCIs)</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Zeki%20Oralhan">Zeki Oralhan</a>, <a href="https://publications.waset.org/abstracts/search?q=Mahmut%20Tokmak%C3%A7%C4%B1"> Mahmut Tokmakçı</a> </p> <p class="card-text"><strong>Abstract:</strong></p> SSVEP based brain computer interface (BCI) systems have been preferred, because of high information transfer rate (ITR) and practical use. ITR is the parameter of BCI overall performance. For high ITR value, one of specification BCI system is that has high accuracy. In this study, we investigated to recognize SSVEP with shorter time and lower error rate. In the experiment, there were 8 flickers on light crystal display (LCD). Participants gazed to flicker which had 12 Hz frequency and 50% duty cycle ratio on the LCD during 10 seconds. During the experiment, EEG signals were acquired via EEG device. The EEG data was filtered in preprocessing session. After that Canonical Correlation Analysis (CCA), Multiset CCA (MsetCCA), phase constrained CCA (PCCA), and Multiway CCA (MwayCCA) methods were applied on data. The highest average accuracy value was reached when MsetCCA was applied. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface" title="brain computer interface">brain computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=canonical%20correlation%20analysis" title=" canonical correlation analysis"> canonical correlation analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20computer%20interaction" title=" human computer interaction"> human computer interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=SSVEP" title=" SSVEP"> SSVEP</a> </p> <a href="https://publications.waset.org/abstracts/54342/frequency-recognition-models-for-steady-state-visual-evoked-potential-based-brain-computer-interfaces-bcis" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/54342.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">266</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4682</span> Device Control Using Brain Computer Interface</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=P.%20Neeraj">P. Neeraj</a>, <a href="https://publications.waset.org/abstracts/search?q=Anurag%20Sharma"> Anurag Sharma</a>, <a href="https://publications.waset.org/abstracts/search?q=Harsukhpreet%20Singh"> Harsukhpreet Singh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In current years, Brain-Computer Interface (BCI) scheme based on steady-state Visual Evoked Potential (SSVEP) have earned much consideration. This study tries to evolve an SSVEP based BCI scheme that can regulate any gadget mock-up in two unique positions ON and OFF. In this paper, two distinctive gleam frequencies in low-frequency part were utilized to evoke the SSVEPs and were shown on a Liquid Crystal Display (LCD) screen utilizing Lab View. Two stimuli shading, Yellow, and Blue were utilized to prepare the system in SSVEPs. The Electroencephalogram (EEG) signals recorded from the occipital part. Elements of the brain were separated by utilizing discrete wavelet Transform. A prominent system for multilayer system diverse Neural Network Algorithm (NNA), is utilized to characterize SSVEP signals. During training of the network with diverse calculation Regression plot results demonstrated that when Levenberg-Marquardt preparing calculation was utilized the exactness turns out to be 93.9%, which is superior to another training algorithm. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface" title="brain computer interface">brain computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=electroencephalography" title=" electroencephalography"> electroencephalography</a>, <a href="https://publications.waset.org/abstracts/search?q=steady-state%20visual%20evoked%20potential" title=" steady-state visual evoked potential"> steady-state visual evoked potential</a>, <a href="https://publications.waset.org/abstracts/search?q=wavelet%20transform" title=" wavelet transform"> wavelet transform</a>, <a href="https://publications.waset.org/abstracts/search?q=neural%20network" title=" neural network"> neural network</a> </p> <a href="https://publications.waset.org/abstracts/47898/device-control-using-brain-computer-interface" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/47898.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">334</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4681</span> Robot Control by ERPs of Brain Waves</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=K.%20T.%20Sun">K. T. Sun</a>, <a href="https://publications.waset.org/abstracts/search?q=Y.%20H.%20Tai"> Y. H. Tai</a>, <a href="https://publications.waset.org/abstracts/search?q=H.%20W.%20Yang"> H. W. Yang</a>, <a href="https://publications.waset.org/abstracts/search?q=H.%20T.%20Lin"> H. T. Lin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presented the technique of robot control by event-related potentials (ERPs) of brain waves. Based on the proposed technique, severe physical disabilities can free browse outside world. A specific component of ERPs, N2P3, was found and used to control the movement of robot and the view of camera on the designed brain-computer interface (BCI). Users only required watching the stimuli of attended button on the BCI, the evoked potentials of brain waves of the target button, N2P3, had the greatest amplitude among all control buttons. An experimental scene had been constructed that the robot required walking to a specific position and move the view of camera to see the instruction of the mission, and then completed the task. Twelve volunteers participated in this experiment, and experimental results showed that the correct rate of BCI control achieved 80% and the average of execution time was 353 seconds for completing the mission. Four main contributions included in this research: (1) find an efficient component of ERPs, N2P3, for BCI control, (2) embed robot's viewpoint image into user interface for robot control, (3) design an experimental scene and conduct the experiment, and (4) evaluate the performance of the proposed system for assessing the practicability. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=severe%20physical%20disabilities" title="severe physical disabilities">severe physical disabilities</a>, <a href="https://publications.waset.org/abstracts/search?q=robot%20control" title=" robot control"> robot control</a>, <a href="https://publications.waset.org/abstracts/search?q=event-related%20potentials%20%28ERPs%29" title=" event-related potentials (ERPs)"> event-related potentials (ERPs)</a>, <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface%20%28BCI%29" title=" brain-computer interface (BCI)"> brain-computer interface (BCI)</a>, <a href="https://publications.waset.org/abstracts/search?q=brain%20waves" title=" brain waves"> brain waves</a> </p> <a href="https://publications.waset.org/abstracts/11425/robot-control-by-erps-of-brain-waves" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/11425.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">369</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4680</span> Identification of EEG Attention Level Using Empirical Mode Decompositions for BCI Applications</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Chia-Ju%20Peng">Chia-Ju Peng</a>, <a href="https://publications.waset.org/abstracts/search?q=Shih-Jui%20Chen"> Shih-Jui Chen</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper proposes a method to discriminate electroencephalogram (EEG) signals between different concentration states using empirical mode decomposition (EMD). Brain-computer interface (BCI), also called brain-machine interface, is a direct communication pathway between the brain and an external device without the inherent pathway such as the peripheral nervous system or skeletal muscles. Attention level is a common index as a control signal of BCI systems. The EEG signals acquired from people paying attention or in relaxation, respectively, are decomposed into a set of intrinsic mode functions (IMF) by EMD. Fast Fourier transform (FFT) analysis is then applied to each IMF to obtain the frequency spectrums. By observing power spectrums of IMFs, the proposed method has the better identification of EEG attention level than the original EEG signals between different concentration states. The band power of IMF3 is the most obvious especially in β wave, which corresponds to fully awake and generally alert. The signal processing method and results of this experiment paves a new way for BCI robotic system using the attention-level control strategy. The integrated signal processing method reveals appropriate information for discrimination of the attention and relaxation, contributing to a more enhanced BCI performance. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=biomedical%20engineering" title="biomedical engineering">biomedical engineering</a>, <a href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface" title=" brain computer interface"> brain computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=electroencephalography" title=" electroencephalography"> electroencephalography</a>, <a href="https://publications.waset.org/abstracts/search?q=rehabilitation" title=" rehabilitation"> rehabilitation</a> </p> <a href="https://publications.waset.org/abstracts/31042/identification-of-eeg-attention-level-using-empirical-mode-decompositions-for-bci-applications" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/31042.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">391</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4679</span> IoT Based Approach to Healthcare System for a Quadriplegic Patient Using EEG</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=R.%20Gautam">R. Gautam</a>, <a href="https://publications.waset.org/abstracts/search?q=P.%20Sastha%20Kanagasabai"> P. Sastha Kanagasabai</a>, <a href="https://publications.waset.org/abstracts/search?q=G.%20N.%20Rathna"> G. N. Rathna </a> </p> <p class="card-text"><strong>Abstract:</strong></p> The proposed healthcare system enables quadriplegic patients, people with severe motor disabilities to send commands to electronic devices and monitor their vitals. The growth of Brain-Computer-Interface (BCI) has led to rapid development in 'assistive systems' for the disabled called 'assistive domotics'. Brain-Computer-Interface is capable of reading the brainwaves of an individual and analyse it to obtain some meaningful data. This processed data can be used to assist people having speech disorders and sometimes people with limited locomotion to communicate. In this Project, Emotiv EPOC Headset is used to obtain the electroencephalogram (EEG). The obtained data is processed to communicate pre-defined commands over the internet to the desired mobile phone user. Other Vital Information like the heartbeat, blood pressure, ECG and body temperature are monitored and uploaded to the server. Data analytics enables physicians to scan databases for a specific illness. The Data is processed in Intel Edison, system on chip (SoC). Patient metrics are displayed via Intel IoT Analytics cloud service. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface" title="brain computer interface">brain computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=Intel%20Edison" title=" Intel Edison"> Intel Edison</a>, <a href="https://publications.waset.org/abstracts/search?q=Emotiv%20EPOC" title=" Emotiv EPOC"> Emotiv EPOC</a>, <a href="https://publications.waset.org/abstracts/search?q=IoT%20analytics" title=" IoT analytics"> IoT analytics</a>, <a href="https://publications.waset.org/abstracts/search?q=electroencephalogram" title=" electroencephalogram"> electroencephalogram</a> </p> <a href="https://publications.waset.org/abstracts/57525/iot-based-approach-to-healthcare-system-for-a-quadriplegic-patient-using-eeg" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/57525.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">186</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4678</span> Enabling Oral Communication and Accelerating Recovery: The Creation of a Novel Low-Cost Electroencephalography-Based Brain-Computer Interface for the Differently Abled</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Rishabh%20Ambavanekar">Rishabh Ambavanekar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Expressive Aphasia (EA) is an oral disability, common among stroke victims, in which the Broca’s area of the brain is damaged, interfering with verbal communication abilities. EA currently has no technological solutions and its only current viable solutions are inefficient or only available to the affluent. This prompts the need for an affordable, innovative solution to facilitate recovery and assist in speech generation. This project proposes a novel concept: using a wearable low-cost electroencephalography (EEG) device-based brain-computer interface (BCI) to translate a user’s inner dialogue into words. A low-cost EEG device was developed and found to be 10 to 100 times less expensive than any current EEG device on the market. As part of the BCI, a machine learning (ML) model was developed and trained using the EEG data. Two stages of testing were conducted to analyze the effectiveness of the device: a proof-of-concept and a final solution test. The proof-of-concept test demonstrated an average accuracy of above 90% and the final solution test demonstrated an average accuracy of above 75%. These two successful tests were used as a basis to demonstrate the viability of BCI research in developing lower-cost verbal communication devices. Additionally, the device proved to not only enable users to verbally communicate but has the potential to also assist in accelerated recovery from the disorder. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=neurotechnology" title="neurotechnology">neurotechnology</a>, <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface" title=" brain-computer interface"> brain-computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=neuroscience" title=" neuroscience"> neuroscience</a>, <a href="https://publications.waset.org/abstracts/search?q=human-machine%20interface" title=" human-machine interface"> human-machine interface</a>, <a href="https://publications.waset.org/abstracts/search?q=BCI" title=" BCI"> BCI</a>, <a href="https://publications.waset.org/abstracts/search?q=HMI" title=" HMI"> HMI</a>, <a href="https://publications.waset.org/abstracts/search?q=aphasia" title=" aphasia"> aphasia</a>, <a href="https://publications.waset.org/abstracts/search?q=verbal%20disability" title=" verbal disability"> verbal disability</a>, <a href="https://publications.waset.org/abstracts/search?q=stroke" title=" stroke"> stroke</a>, <a href="https://publications.waset.org/abstracts/search?q=low-cost" title=" low-cost"> low-cost</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=ML" title=" ML"> ML</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20recognition" title=" image recognition"> image recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=EEG" title=" EEG"> EEG</a>, <a href="https://publications.waset.org/abstracts/search?q=signal%20analysis" title=" signal analysis"> signal analysis</a> </p> <a href="https://publications.waset.org/abstracts/149743/enabling-oral-communication-and-accelerating-recovery-the-creation-of-a-novel-low-cost-electroencephalography-based-brain-computer-interface-for-the-differently-abled" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/149743.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">119</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4677</span> Initial Dip: An Early Indicator of Neural Activity in Functional Near Infrared Spectroscopy Waveform</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mannan%20Malik%20Muhammad%20Naeem">Mannan Malik Muhammad Naeem</a>, <a href="https://publications.waset.org/abstracts/search?q=Jeong%20Myung%20Yung"> Jeong Myung Yung </a> </p> <p class="card-text"><strong>Abstract:</strong></p> Functional near infrared spectroscopy (fNIRS) has a favorable position in non-invasive brain imaging techniques. The concentration change of oxygenated hemoglobin and de-oxygenated hemoglobin during particular cognitive activity is the basis for this neuro-imaging modality. Two wavelengths of near-infrared light can be used with modified Beer-Lambert law to explain the indirect status of neuronal activity inside brain. The temporal resolution of fNIRS is very good for real-time brain computer-interface applications. The portability, low cost and an acceptable temporal resolution of fNIRS put it on a better position in neuro-imaging modalities. In this study, an optimization model for impulse response function has been used to estimate/predict initial dip using fNIRS data. In addition, the activity strength parameter related to motor based cognitive task has been analyzed. We found an initial dip that remains around 200-300 millisecond and better localize neural activity. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=fNIRS" title="fNIRS">fNIRS</a>, <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface" title=" brain-computer interface"> brain-computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=optimization%20algorithm" title=" optimization algorithm"> optimization algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=adaptive%20signal%20processing" title=" adaptive signal processing"> adaptive signal processing</a> </p> <a href="https://publications.waset.org/abstracts/84942/initial-dip-an-early-indicator-of-neural-activity-in-functional-near-infrared-spectroscopy-waveform" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/84942.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">226</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4676</span> Electroencephalogram Based Approach for Mental Stress Detection during Gameplay with Level Prediction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Priyadarsini%20Samal">Priyadarsini Samal</a>, <a href="https://publications.waset.org/abstracts/search?q=Rajesh%20Singla"> Rajesh Singla</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Many mobile games come with the benefits of entertainment by introducing stress to the human brain. In recognizing this mental stress, the brain-computer interface (BCI) plays an important role. It has various neuroimaging approaches which help in analyzing the brain signals. Electroencephalogram (EEG) is the most commonly used method among them as it is non-invasive, portable, and economical. Here, this paper investigates the pattern in brain signals when introduced with mental stress. Two healthy volunteers played a game whose aim was to search hidden words from the grid, and the levels were chosen randomly. The EEG signals during gameplay were recorded to investigate the impacts of stress with the changing levels from easy to medium to hard. A total of 16 features of EEG were analyzed for this experiment which includes power band features with relative powers, event-related desynchronization, along statistical features. Support vector machine was used as the classifier, which resulted in an accuracy of 93.9% for three-level stress analysis; for two levels, the accuracy of 92% and 98% are achieved. In addition to that, another game that was similar in nature was played by the volunteers. A suitable regression model was designed for prediction where the feature sets of the first and second game were used for testing and training purposes, respectively, and an accuracy of 73% was found. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface" title="brain computer interface">brain computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=electroencephalogram" title=" electroencephalogram"> electroencephalogram</a>, <a href="https://publications.waset.org/abstracts/search?q=regression%20model" title=" regression model"> regression model</a>, <a href="https://publications.waset.org/abstracts/search?q=stress" title=" stress"> stress</a>, <a href="https://publications.waset.org/abstracts/search?q=word%20search" title=" word search"> word search</a> </p> <a href="https://publications.waset.org/abstracts/139736/electroencephalogram-based-approach-for-mental-stress-detection-during-gameplay-with-level-prediction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/139736.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">187</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4675</span> A Real Time Set Up for Retrieval of Emotional States from Human Neural Responses</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Rashima%20Mahajan">Rashima Mahajan</a>, <a href="https://publications.waset.org/abstracts/search?q=Dipali%20Bansal"> Dipali Bansal</a>, <a href="https://publications.waset.org/abstracts/search?q=Shweta%20Singh"> Shweta Singh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Real time non-invasive Brain Computer Interfaces have a significant progressive role in restoring or maintaining a quality life for medically challenged people. This manuscript provides a comprehensive review of emerging research in the field of cognitive/affective computing in context of human neural responses. The perspectives of different emotion assessment modalities like face expressions, speech, text, gestures, and human physiological responses have also been discussed. Focus has been paid to explore the ability of EEG (Electroencephalogram) signals to portray thoughts, feelings, and unspoken words. An automated workflow-based protocol to design an EEG-based real time Brain Computer Interface system for analysis and classification of human emotions elicited by external audio/visual stimuli has been proposed. The front end hardware includes a cost effective and portable Emotive EEG Neuroheadset unit, a personal computer and a set of external stimulators. Primary signal analysis and processing of real time acquired EEG shall be performed using MATLAB based advanced brain mapping toolbox EEGLab/BCILab. This shall be followed by the development of MATLAB based self-defined algorithm to capture and characterize temporal and spectral variations in EEG under emotional stimulations. The extracted hybrid feature set shall be used to classify emotional states using artificial intelligence tools like Artificial Neural Network. The final system would result in an inexpensive, portable and more intuitive Brain Computer Interface in real time scenario to control prosthetic devices by translating different brain states into operative control signals. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface" title="brain computer interface">brain computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=electroencephalogram" title=" electroencephalogram"> electroencephalogram</a>, <a href="https://publications.waset.org/abstracts/search?q=EEGLab" title=" EEGLab"> EEGLab</a>, <a href="https://publications.waset.org/abstracts/search?q=BCILab" title=" BCILab"> BCILab</a>, <a href="https://publications.waset.org/abstracts/search?q=emotive" title=" emotive"> emotive</a>, <a href="https://publications.waset.org/abstracts/search?q=emotions" title=" emotions"> emotions</a>, <a href="https://publications.waset.org/abstracts/search?q=interval%20features" title=" interval features"> interval features</a>, <a href="https://publications.waset.org/abstracts/search?q=spectral%20features" title=" spectral features"> spectral features</a>, <a href="https://publications.waset.org/abstracts/search?q=artificial%20neural%20network" title=" artificial neural network"> artificial neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=control%20applications" title=" control applications"> control applications</a> </p> <a href="https://publications.waset.org/abstracts/6428/a-real-time-set-up-for-retrieval-of-emotional-states-from-human-neural-responses" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/6428.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">317</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4674</span> An ANOVA-based Sequential Forward Channel Selection Framework for Brain-Computer Interface Application based on EEG Signals Driven by Motor Imagery</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Forouzan%20Salehi%20Fergeni">Forouzan Salehi Fergeni</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Converting the movement intents of a person into commands for action employing brain signals like electroencephalogram signals is a brain-computer interface (BCI) system. When left or right-hand motions are imagined, different patterns of brain activity appear, which can be employed as BCI signals for control. To make better the brain-computer interface (BCI) structures, effective and accurate techniques for increasing the classifying precision of motor imagery (MI) based on electroencephalography (EEG) are greatly needed. Subject dependency and non-stationary are two features of EEG signals. So, EEG signals must be effectively processed before being used in BCI applications. In the present study, after applying an 8 to 30 band-pass filter, a car spatial filter is rendered for the purpose of denoising, and then, a method of analysis of variance is used to select more appropriate and informative channels from a category of a large number of different channels. After ordering channels based on their efficiencies, a sequential forward channel selection is employed to choose just a few reliable ones. Features from two domains of time and wavelet are extracted and shortlisted with the help of a statistical technique, namely the t-test. Finally, the selected features are classified with different machine learning and neural network classifiers being k-nearest neighbor, Probabilistic neural network, support-vector-machine, Extreme learning machine, decision tree, Multi-layer perceptron, and linear discriminant analysis with the purpose of comparing their performance in this application. Utilizing a ten-fold cross-validation approach, tests are performed on a motor imagery dataset found in the BCI competition III. Outcomes demonstrated that the SVM classifier got the greatest classification precision of 97% when compared to the other available approaches. The entire investigative findings confirm that the suggested framework is reliable and computationally effective for the construction of BCI systems and surpasses the existing methods. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface" title="brain-computer interface">brain-computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=channel%20selection" title=" channel selection"> channel selection</a>, <a href="https://publications.waset.org/abstracts/search?q=motor%20imagery" title=" motor imagery"> motor imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=support-vector-machine" title=" support-vector-machine"> support-vector-machine</a> </p> <a href="https://publications.waset.org/abstracts/186038/an-anova-based-sequential-forward-channel-selection-framework-for-brain-computer-interface-application-based-on-eeg-signals-driven-by-motor-imagery" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/186038.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">50</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4673</span> A Brain Controlled Robotic Gait Trainer for Neurorehabilitation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Qazi%20Umer%20Jamil">Qazi Umer Jamil</a>, <a href="https://publications.waset.org/abstracts/search?q=Abubakr%20Siddique"> Abubakr Siddique</a>, <a href="https://publications.waset.org/abstracts/search?q=Mubeen%20Ur%20Rehman"> Mubeen Ur Rehman</a>, <a href="https://publications.waset.org/abstracts/search?q=Nida%20Aziz"> Nida Aziz</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohsin%20I.%20Tiwana"> Mohsin I. Tiwana</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper discusses a brain controlled robotic gait trainer for neurorehabilitation of Spinal Cord Injury (SCI) patients. Patients suffering from Spinal Cord Injuries (SCI) become unable to execute motion control of their lower proximities due to degeneration of spinal cord neurons. The presented approach can help SCI patients in neuro-rehabilitation training by directly translating patient motor imagery into walkers motion commands and thus bypassing spinal cord neurons completely. A non-invasive EEG based brain-computer interface is used for capturing patient neural activity. For signal processing and classification, an open source software (OpenVibe) is used. Classifiers categorize the patient motor imagery (MI) into a specific set of commands that are further translated into walker motion commands. The robotic walker also employs fall detection for ensuring safety of patient during gait training and can act as a support for SCI patients. The gait trainer is tested with subjects, and satisfactory results were achieved. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29" title="brain computer interface (BCI)">brain computer interface (BCI)</a>, <a href="https://publications.waset.org/abstracts/search?q=gait%20trainer" title=" gait trainer"> gait trainer</a>, <a href="https://publications.waset.org/abstracts/search?q=spinal%20cord%20injury%20%28SCI%29" title=" spinal cord injury (SCI)"> spinal cord injury (SCI)</a>, <a href="https://publications.waset.org/abstracts/search?q=neurorehabilitation" title=" neurorehabilitation"> neurorehabilitation</a> </p> <a href="https://publications.waset.org/abstracts/107088/a-brain-controlled-robotic-gait-trainer-for-neurorehabilitation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/107088.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">161</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4672</span> Comparative Analysis of Spectral Estimation Methods for Brain-Computer Interfaces</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Rafik%20Djemili">Rafik Djemili</a>, <a href="https://publications.waset.org/abstracts/search?q=Hocine%20Bourouba"> Hocine Bourouba</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20C.%20Amara%20Korba"> M. C. Amara Korba</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we present a method in order to classify EEG signals for Brain-Computer Interfaces (BCI). EEG signals are first processed by means of spectral estimation methods to derive reliable features before classification step. Spectral estimation methods used are standard periodogram and the periodogram calculated by the Welch method; both methods are compared with Logarithm of Band Power (logBP) features. In the method proposed, we apply Linear Discriminant Analysis (LDA) followed by Support Vector Machine (SVM). Classification accuracy reached could be as high as 85%, which proves the effectiveness of classification of EEG signals based BCI using spectral methods. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface" title="brain-computer interface">brain-computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=motor%20imagery" title=" motor imagery"> motor imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=electroencephalogram" title=" electroencephalogram"> electroencephalogram</a>, <a href="https://publications.waset.org/abstracts/search?q=linear%20discriminant%20analysis" title=" linear discriminant analysis"> linear discriminant analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=support%20vector%20machine" title=" support vector machine"> support vector machine</a> </p> <a href="https://publications.waset.org/abstracts/6971/comparative-analysis-of-spectral-estimation-methods-for-brain-computer-interfaces" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/6971.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">499</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4671</span> Non-Uniform Filter Banks-based Minimum Distance to Riemannian Mean Classifition in Motor Imagery Brain-Computer Interface</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ping%20Tan">Ping Tan</a>, <a href="https://publications.waset.org/abstracts/search?q=Xiaomeng%20Su"> Xiaomeng Su</a>, <a href="https://publications.waset.org/abstracts/search?q=Yi%20Shen"> Yi Shen</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The motion intention in the motor imagery braincomputer interface is identified by classifying the event-related desynchronization (ERD) and event-related synchronization ERS characteristics of sensorimotor rhythm (SMR) in EEG signals. When the subject imagines different limbs or different parts moving, the rhythm components and bandwidth will change, which varies from person to person. How to find the effective sensorimotor frequency band of subjects is directly related to the classification accuracy of brain-computer interface. To solve this problem, this paper proposes a Minimum Distance to Riemannian Mean Classification method based on Non-Uniform Filter Banks. During the training phase, the EEG signals are decomposed into multiple different bandwidt signals by using multiple band-pass filters firstly; Then the spatial covariance characteristics of each frequency band signal are computered to be as the feature vectors. these feature vectors will be classified by the MDRM (Minimum Distance to Riemannian Mean) method, and cross validation is employed to obtain the effective sensorimotor frequency bands. During the test phase, the test signals are filtered by the bandpass filter of the effective sensorimotor frequency bands, and the extracted spatial covariance feature vectors will be classified by using the MDRM. Experiments on the BCI competition IV 2a dataset show that the proposed method is superior to other classification methods. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=non-uniform%20filter%20banks" title="non-uniform filter banks">non-uniform filter banks</a>, <a href="https://publications.waset.org/abstracts/search?q=motor%20imagery" title=" motor imagery"> motor imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface" title=" brain-computer interface"> brain-computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=minimum%20distance%20to%20Riemannian%20mean" title=" minimum distance to Riemannian mean"> minimum distance to Riemannian mean</a> </p> <a href="https://publications.waset.org/abstracts/162018/non-uniform-filter-banks-based-minimum-distance-to-riemannian-mean-classifition-in-motor-imagery-brain-computer-interface" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/162018.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">126</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4670</span> Optimized Brain Computer Interface System for Unspoken Speech Recognition: Role of Wernicke Area</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Nassib%20Abdallah">Nassib Abdallah</a>, <a href="https://publications.waset.org/abstracts/search?q=Pierre%20Chauvet"> Pierre Chauvet</a>, <a href="https://publications.waset.org/abstracts/search?q=Abd%20El%20Salam%20Hajjar"> Abd El Salam Hajjar</a>, <a href="https://publications.waset.org/abstracts/search?q=Bassam%20Daya"> Bassam Daya</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we propose an optimized brain computer interface (BCI) system for unspoken speech recognition, based on the fact that the constructions of unspoken words rely strongly on the Wernicke area, situated in the temporal lobe. Our BCI system has four modules: (i) the EEG Acquisition module based on a non-invasive headset with 14 electrodes; (ii) the Preprocessing module to remove noise and artifacts, using the Common Average Reference method; (iii) the Features Extraction module, using Wavelet Packet Transform (WPT); (iv) the Classification module based on a one-hidden layer artificial neural network. The present study consists of comparing the recognition accuracy of 5 Arabic words, when using all the headset electrodes or only the 4 electrodes situated near the Wernicke area, as well as the selection effect of the subbands produced by the WPT module. After applying the articial neural network on the produced database, we obtain, on the test dataset, an accuracy of 83.4% with all the electrodes and all the subbands of 8 levels of the WPT decomposition. However, by using only the 4 electrodes near Wernicke Area and the 6 middle subbands of the WPT, we obtain a high reduction of the dataset size, equal to approximately 19% of the total dataset, with 67.5% of accuracy rate. This reduction appears particularly important to improve the design of a low cost and simple to use BCI, trained for several words. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface" title="brain-computer interface">brain-computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=speech%20recognition" title=" speech recognition"> speech recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=artificial%20neural%20network" title=" artificial neural network"> artificial neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=electroencephalography" title=" electroencephalography"> electroencephalography</a>, <a href="https://publications.waset.org/abstracts/search?q=EEG" title=" EEG"> EEG</a>, <a href="https://publications.waset.org/abstracts/search?q=wernicke%20area" title=" wernicke area"> wernicke area</a> </p> <a href="https://publications.waset.org/abstracts/86773/optimized-brain-computer-interface-system-for-unspoken-speech-recognition-role-of-wernicke-area" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/86773.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">272</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4669</span> An Analysis of OpenSim Graphical User Interface Effectiveness</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sina%20Saadati">Sina Saadati</a> </p> <p class="card-text"><strong>Abstract:</strong></p> OpenSim is a well-known software in biomechanical studies. There are worthy algorithms developed in this program which are used for modeling and simulation of human motions. In this research, we analyze the OpenSim application from the computer science perspective. It is important that every application have a user-friendly interface. An effective user interface can decrease the time, costs, and energy needed to learn how to use a program. In this paper, we survey the user interface of OpenSim as an important factor of the software. Finally, we infer that there are many challenges to be addressed in the development of OpenSim. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=biomechanics" title="biomechanics">biomechanics</a>, <a href="https://publications.waset.org/abstracts/search?q=computer%20engineering" title=" computer engineering"> computer engineering</a>, <a href="https://publications.waset.org/abstracts/search?q=graphical%20user%20interface" title=" graphical user interface"> graphical user interface</a>, <a href="https://publications.waset.org/abstracts/search?q=modeling%20and%20simulation" title=" modeling and simulation"> modeling and simulation</a>, <a href="https://publications.waset.org/abstracts/search?q=interface%20effectiveness" title=" interface effectiveness"> interface effectiveness</a> </p> <a href="https://publications.waset.org/abstracts/168517/an-analysis-of-opensim-graphical-user-interface-effectiveness" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/168517.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">95</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4668</span> Computer Aided Diagnostic System for Detection and Classification of a Brain Tumor through MRI Using Level Set Based Segmentation Technique and ANN Classifier</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Atanu%20K%20Samanta">Atanu K Samanta</a>, <a href="https://publications.waset.org/abstracts/search?q=Asim%20Ali%20Khan"> Asim Ali Khan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Due to the acquisition of huge amounts of brain tumor magnetic resonance images (MRI) in clinics, it is very difficult for radiologists to manually interpret and segment these images within a reasonable span of time. Computer-aided diagnosis (CAD) systems can enhance the diagnostic capabilities of radiologists and reduce the time required for accurate diagnosis. An intelligent computer-aided technique for automatic detection of a brain tumor through MRI is presented in this paper. The technique uses the following computational methods; the Level Set for segmentation of a brain tumor from other brain parts, extraction of features from this segmented tumor portion using gray level co-occurrence Matrix (GLCM), and the Artificial Neural Network (ANN) to classify brain tumor images according to their respective types. The entire work is carried out on 50 images having five types of brain tumor. The overall classification accuracy using this method is found to be 98% which is significantly good. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain%20tumor" title="brain tumor">brain tumor</a>, <a href="https://publications.waset.org/abstracts/search?q=computer-aided%20diagnostic%20%28CAD%29%20system" title=" computer-aided diagnostic (CAD) system"> computer-aided diagnostic (CAD) system</a>, <a href="https://publications.waset.org/abstracts/search?q=gray-level%20co-occurrence%20matrix%20%28GLCM%29" title=" gray-level co-occurrence matrix (GLCM)"> gray-level co-occurrence matrix (GLCM)</a>, <a href="https://publications.waset.org/abstracts/search?q=tumor%20segmentation" title=" tumor segmentation"> tumor segmentation</a>, <a href="https://publications.waset.org/abstracts/search?q=level%20set%20method" title=" level set method"> level set method</a> </p> <a href="https://publications.waset.org/abstracts/61237/computer-aided-diagnostic-system-for-detection-and-classification-of-a-brain-tumor-through-mri-using-level-set-based-segmentation-technique-and-ann-classifier" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/61237.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">512</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4667</span> Noninvasive Brain-Machine Interface to Control Both Mecha TE Robotic Hands Using Emotiv EEG Neuroheadset</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Adrienne%20Kline">Adrienne Kline</a>, <a href="https://publications.waset.org/abstracts/search?q=Jaydip%20Desai"> Jaydip Desai</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Electroencephalogram (EEG) is a noninvasive technique that registers signals originating from the firing of neurons in the brain. The Emotiv EEG Neuroheadset is a consumer product comprised of 14 EEG channels and was used to record the reactions of the neurons within the brain to two forms of stimuli in 10 participants. These stimuli consisted of auditory and visual formats that provided directions of ‘right’ or ‘left.’ Participants were instructed to raise their right or left arm in accordance with the instruction given. A scenario in OpenViBE was generated to both stimulate the participants while recording their data. In OpenViBE, the Graz Motor BCI Stimulator algorithm was configured to govern the duration and number of visual stimuli. Utilizing EEGLAB under the cross platform MATLAB®, the electrodes most stimulated during the study were defined. Data outputs from EEGLAB were analyzed using IBM SPSS Statistics® Version 20. This aided in determining the electrodes to use in the development of a brain-machine interface (BMI) using real-time EEG signals from the Emotiv EEG Neuroheadset. Signal processing and feature extraction were accomplished via the Simulink® signal processing toolbox. An Arduino™ Duemilanove microcontroller was used to link the Emotiv EEG Neuroheadset and the right and left Mecha TE™ Hands. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain-machine%20interface" title="brain-machine interface">brain-machine interface</a>, <a href="https://publications.waset.org/abstracts/search?q=EEGLAB" title=" EEGLAB"> EEGLAB</a>, <a href="https://publications.waset.org/abstracts/search?q=emotiv%20EEG%20neuroheadset" title=" emotiv EEG neuroheadset"> emotiv EEG neuroheadset</a>, <a href="https://publications.waset.org/abstracts/search?q=OpenViBE" title=" OpenViBE"> OpenViBE</a>, <a href="https://publications.waset.org/abstracts/search?q=simulink" title=" simulink"> simulink</a> </p> <a href="https://publications.waset.org/abstracts/28333/noninvasive-brain-machine-interface-to-control-both-mecha-te-robotic-hands-using-emotiv-eeg-neuroheadset" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/28333.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">502</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4666</span> Analysis of Matching Pursuit Features of EEG Signal for Mental Tasks Classification </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Zin%20Mar%20Lwin">Zin Mar Lwin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Brain Computer Interface (BCI) Systems have developed for people who suffer from severe motor disabilities and challenging to communicate with their environment. BCI allows them for communication by a non-muscular way. For communication between human and computer, BCI uses a type of signal called Electroencephalogram (EEG) signal which is recorded from the human„s brain by means of an electrode. The electroencephalogram (EEG) signal is an important information source for knowing brain processes for the non-invasive BCI. Translating human‟s thought, it needs to classify acquired EEG signal accurately. This paper proposed a typical EEG signal classification system which experiments the Dataset from “Purdue University.” Independent Component Analysis (ICA) method via EEGLab Tools for removing artifacts which are caused by eye blinks. For features extraction, the Time and Frequency features of non-stationary EEG signals are extracted by Matching Pursuit (MP) algorithm. The classification of one of five mental tasks is performed by Multi_Class Support Vector Machine (SVM). For SVMs, the comparisons have been carried out for both 1-against-1 and 1-against-all methods. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=BCI" title="BCI">BCI</a>, <a href="https://publications.waset.org/abstracts/search?q=EEG" title=" EEG"> EEG</a>, <a href="https://publications.waset.org/abstracts/search?q=ICA" title=" ICA"> ICA</a>, <a href="https://publications.waset.org/abstracts/search?q=SVM" title=" SVM"> SVM</a> </p> <a href="https://publications.waset.org/abstracts/19307/analysis-of-matching-pursuit-features-of-eeg-signal-for-mental-tasks-classification" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/19307.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">278</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4665</span> Brain-Computer Interface Based Real-Time Control of Fixed Wing and Multi-Rotor Unmanned Aerial Vehicles</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ravi%20Vishwanath">Ravi Vishwanath</a>, <a href="https://publications.waset.org/abstracts/search?q=Saumya%20Kumaar"> Saumya Kumaar</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20N.%20Omkar"> S. N. Omkar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Brain-computer interfacing (BCI) is a technology that is almost four decades old, and it was developed solely for the purpose of developing and enhancing the impact of neuroprosthetics. However, in the recent times, with the commercialization of non-invasive electroencephalogram (EEG) headsets, the technology has seen a wide variety of applications like home automation, wheelchair control, vehicle steering, etc. One of the latest developed applications is the mind-controlled quadrotor unmanned aerial vehicle. These applications, however, do not require a very high-speed response and give satisfactory results when standard classification methods like Support Vector Machine (SVM) and Multi-Layer Perceptron (MLPC). Issues are faced when there is a requirement for high-speed control in the case of fixed-wing unmanned aerial vehicles where such methods are rendered unreliable due to the low speed of classification. Such an application requires the system to classify data at high speeds in order to retain the controllability of the vehicle. This paper proposes a novel method of classification which uses a combination of Common Spatial Paradigm and Linear Discriminant Analysis that provides an improved classification accuracy in real time. A non-linear SVM based classification technique has also been discussed. Further, this paper discusses the implementation of the proposed method on a fixed-wing and VTOL unmanned aerial vehicles. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface" title="brain-computer interface">brain-computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicles" title=" unmanned aerial vehicles"> unmanned aerial vehicles</a> </p> <a href="https://publications.waset.org/abstracts/87914/brain-computer-interface-based-real-time-control-of-fixed-wing-and-multi-rotor-unmanned-aerial-vehicles" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/87914.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">283</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4664</span> Development of a Computer Aided Diagnosis Tool for Brain Tumor Extraction and Classification</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Fathi%20Kallel">Fathi Kallel</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdulelah%20Alabd%20Uljabbar"> Abdulelah Alabd Uljabbar</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdulrahman%20Aldukhail"> Abdulrahman Aldukhail</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdulaziz%20Alomran"> Abdulaziz Alomran</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The brain is an important organ in our body since it is responsible about the majority actions such as vision, memory, etc. However, different diseases such as Alzheimer and tumors could affect the brain and conduct to a partial or full disorder. Regular diagnosis are necessary as a preventive measure and could help doctors to early detect a possible trouble and therefore taking the appropriate treatment, especially in the case of brain tumors. Different imaging modalities are proposed for diagnosis of brain tumor. The powerful and most used modality is the Magnetic Resonance Imaging (MRI). MRI images are analyzed by doctor in order to locate eventual tumor in the brain and describe the appropriate and needed treatment. Diverse image processing methods are also proposed for helping doctors in identifying and analyzing the tumor. In fact, a large Computer Aided Diagnostic (CAD) tools including developed image processing algorithms are proposed and exploited by doctors as a second opinion to analyze and identify the brain tumors. In this paper, we proposed a new advanced CAD for brain tumor identification, classification and feature extraction. Our proposed CAD includes three main parts. Firstly, we load the brain MRI. Secondly, a robust technique for brain tumor extraction is proposed. This technique is based on both Discrete Wavelet Transform (DWT) and Principal Component Analysis (PCA). DWT is characterized by its multiresolution analytic property, that’s why it was applied on MRI images with different decomposition levels for feature extraction. Nevertheless, this technique suffers from a main drawback since it necessitates a huge storage and is computationally expensive. To decrease the dimensions of the feature vector and the computing time, PCA technique is considered. In the last stage, according to different extracted features, the brain tumor is classified into either benign or malignant tumor using Support Vector Machine (SVM) algorithm. A CAD tool for brain tumor detection and classification, including all above-mentioned stages, is designed and developed using MATLAB guide user interface. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=MRI" title="MRI">MRI</a>, <a href="https://publications.waset.org/abstracts/search?q=brain%20tumor" title=" brain tumor"> brain tumor</a>, <a href="https://publications.waset.org/abstracts/search?q=CAD" title=" CAD"> CAD</a>, <a href="https://publications.waset.org/abstracts/search?q=feature%20extraction" title=" feature extraction"> feature extraction</a>, <a href="https://publications.waset.org/abstracts/search?q=DWT" title=" DWT"> DWT</a>, <a href="https://publications.waset.org/abstracts/search?q=PCA" title=" PCA"> PCA</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a>, <a href="https://publications.waset.org/abstracts/search?q=SVM" title=" SVM"> SVM</a> </p> <a href="https://publications.waset.org/abstracts/81523/development-of-a-computer-aided-diagnosis-tool-for-brain-tumor-extraction-and-classification" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/81523.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">250</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4663</span> Review and Evaluation of Trending Canonical Correlation Analyses-Based Brain Computer Interface Methods</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Bayar%20Shahab">Bayar Shahab</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The fast development of technology that has advanced neuroscience and human interaction with computers has enabled solutions to various problems, and issues of this new era have been found and are being found like no other time in history. Brain-computer interface so-called BCI has opened the door to several new research areas and have been able to provide solutions to critical and important issues such as supporting a paralyzed patient to interact with the outside world, controlling a robot arm, playing games in VR with the brain, driving a wheelchair or even a car and neurotechnology enabled the rehabilitation of the lost memory, etc. This review work presents state-of-the-art methods and improvements of canonical correlation analyses (CCA), which is an SSVEP-based BCI method. These are the methods used to extract EEG signal features or, to be said in a different way, the features of interest that we are looking for in the EEG analyses. Each of the methods from oldest to newest has been discussed while comparing their advantages and disadvantages. This would create a great context and help researchers to understand the most state-of-the-art methods available in this field with their pros and cons, along with their mathematical representations and usage. This work makes a vital contribution to the existing field of study. It differs from other similar recently published works by providing the following: (1) stating most of the prominent methods used in this field in a hierarchical way (2) explaining pros and cons of each method and their performance (3) presenting the gaps that exist at the end of each method that can open the understanding and doors to new research and/or improvements. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=BCI" title="BCI">BCI</a>, <a href="https://publications.waset.org/abstracts/search?q=CCA" title=" CCA"> CCA</a>, <a href="https://publications.waset.org/abstracts/search?q=SSVEP" title=" SSVEP"> SSVEP</a>, <a href="https://publications.waset.org/abstracts/search?q=EEG" title=" EEG"> EEG</a> </p> <a href="https://publications.waset.org/abstracts/142170/review-and-evaluation-of-trending-canonical-correlation-analyses-based-brain-computer-interface-methods" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/142170.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">145</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4662</span> Development of a Real-Time Brain-Computer Interface for Interactive Robot Therapy: An Exploration of EEG and EMG Features during Hypnosis</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Maryam%20Alimardani">Maryam Alimardani</a>, <a href="https://publications.waset.org/abstracts/search?q=Kazuo%20Hiraki"> Kazuo Hiraki</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This study presents a framework for development of a new generation of therapy robots that can interact with users by monitoring their physiological and mental states. Here, we focused on one of the controversial methods of therapy, hypnotherapy. Hypnosis has shown to be useful in treatment of many clinical conditions. But, even for healthy people, it can be used as an effective technique for relaxation or enhancement of memory and concentration. Our aim is to develop a robot that collects information about user&rsquo;s mental and physical states using electroencephalogram (EEG) and electromyography (EMG) signals and performs costeffective hypnosis at the comfort of user&rsquo;s house. The presented framework consists of three main steps: (1) Find the EEG-correlates of mind state before, during, and after hypnosis and establish a cognitive model for state changes, (2) Develop a system that can track the changes in EEG and EMG activities in real time and determines if the user is ready for suggestion, and (3) Implement our system in a humanoid robot that will talk and conduct hypnosis on users based on their mental states. This paper presents a pilot study in regard to the first stage, detection of EEG and EMG features during hypnosis. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=hypnosis" title="hypnosis">hypnosis</a>, <a href="https://publications.waset.org/abstracts/search?q=EEG" title=" EEG"> EEG</a>, <a href="https://publications.waset.org/abstracts/search?q=robotherapy" title=" robotherapy"> robotherapy</a>, <a href="https://publications.waset.org/abstracts/search?q=brain-computer%20interface%20%28BCI%29" title=" brain-computer interface (BCI)"> brain-computer interface (BCI)</a> </p> <a href="https://publications.waset.org/abstracts/53787/development-of-a-real-time-brain-computer-interface-for-interactive-robot-therapy-an-exploration-of-eeg-and-emg-features-during-hypnosis" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/53787.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">256</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4661</span> Effect of Signal Acquisition Procedure on Imagined Speech Classification Accuracy</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=M.R%20Asghari%20Bejestani">M.R Asghari Bejestani</a>, <a href="https://publications.waset.org/abstracts/search?q=Gh.%20R.%20Mohammad%20Khani"> Gh. R. Mohammad Khani</a>, <a href="https://publications.waset.org/abstracts/search?q=V.R.%20Nafisi"> V.R. Nafisi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Imagined speech recognition is one of the most interesting approaches to BCI development and a lot of works have been done in this area. Many different experiments have been designed and hundreds of combinations of feature extraction methods and classifiers have been examined. Reported classification accuracies range from the chance level to more than 90%. Based on non-stationary nature of brain signals, we have introduced 3 classification modes according to time difference in inter and intra-class samples. The modes can explain the diversity of reported results and predict the range of expected classification accuracies from the brain signal accusation procedure. In this paper, a few samples are illustrated by inspecting results of some previous works. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface" title="brain computer interface">brain computer interface</a>, <a href="https://publications.waset.org/abstracts/search?q=silent%20talk" title=" silent talk"> silent talk</a>, <a href="https://publications.waset.org/abstracts/search?q=imagined%20speech" title=" imagined speech"> imagined speech</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a>, <a href="https://publications.waset.org/abstracts/search?q=signal%20processing" title=" signal processing"> signal processing</a> </p> <a href="https://publications.waset.org/abstracts/154214/effect-of-signal-acquisition-procedure-on-imagined-speech-classification-accuracy" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/154214.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">153</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4660</span> Real Time Acquisition and Psychoacoustic Analysis of Brain Wave</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shweta%20Singh">Shweta Singh</a>, <a href="https://publications.waset.org/abstracts/search?q=Dipali%20Bansal"> Dipali Bansal</a>, <a href="https://publications.waset.org/abstracts/search?q=Rashima%20Mahajan"> Rashima Mahajan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Psychoacoustics has become a potential area of research due to the growing interest of both laypersons and medical and mental health professionals. Non-invasive brain computer interface like Electroencephalography (EEG) is widely being used in this field. An attempt has been made in this paper to examine the response of EEG signals to acoustic stimuli further analysing the brain electrical activity. The real time EEG is acquired for 6 participants using a cost effective and portable EMOTIV EEG neuron headset. EEG data analysis is further done using EMOTIV test bench, EDF browser and EEGLAB (MATLAB Tool) application software platforms. Spectral analysis of acquired neural signals (AF3 channel) using these software platforms are clearly indicative of increased brain activity in various bands. The inferences drawn from such an analysis have significant correlation with subject’s subjective reporting of the experiences. The results suggest that the methodology adopted can further be used to assist patients with sleeping and depressive disorders. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=OM%20chant" title="OM chant">OM chant</a>, <a href="https://publications.waset.org/abstracts/search?q=spectral%20analysis" title=" spectral analysis"> spectral analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=EDF%20browser" title=" EDF browser"> EDF browser</a>, <a href="https://publications.waset.org/abstracts/search?q=EEGLAB" title=" EEGLAB"> EEGLAB</a>, <a href="https://publications.waset.org/abstracts/search?q=EMOTIV" title=" EMOTIV"> EMOTIV</a>, <a href="https://publications.waset.org/abstracts/search?q=real%20time%20acquisition" title=" real time acquisition"> real time acquisition</a> </p> <a href="https://publications.waset.org/abstracts/6427/real-time-acquisition-and-psychoacoustic-analysis-of-brain-wave" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/6427.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">281</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4659</span> The Use of Network Tool for Brain Signal Data Analysis: A Case Study with Blind and Sighted Individuals</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Cleiton%20Pons%20Ferreira">Cleiton Pons Ferreira</a>, <a href="https://publications.waset.org/abstracts/search?q=Diana%20Francisca%20Adamatti"> Diana Francisca Adamatti</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Advancements in computers technology have allowed to obtain information for research in biology and neuroscience. In order to transform the data from these surveys, networks have long been used to represent important biological processes, changing the use of this tools from purely illustrative and didactic to more analytic, even including interaction analysis and hypothesis formulation. Many studies have involved this application, but not directly for interpretation of data obtained from brain functions, asking for new perspectives of development in neuroinformatics using existent models of tools already disseminated by the bioinformatics. This study includes an analysis of neurological data through electroencephalogram (EEG) signals, using the Cytoscape, an open source software tool for visualizing complex networks in biological databases. The data were obtained from a comparative case study developed in a research from the University of Rio Grande (FURG), using the EEG signals from a Brain Computer Interface (BCI) with 32 eletrodes prepared in the brain of a blind and a sighted individuals during the execution of an activity that stimulated the spatial ability. This study intends to present results that lead to better ways for use and adapt techniques that support the data treatment of brain signals for elevate the understanding and learning in neuroscience. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=neuroinformatics" title="neuroinformatics">neuroinformatics</a>, <a href="https://publications.waset.org/abstracts/search?q=bioinformatics" title=" bioinformatics"> bioinformatics</a>, <a href="https://publications.waset.org/abstracts/search?q=network%20tools" title=" network tools"> network tools</a>, <a href="https://publications.waset.org/abstracts/search?q=brain%20mapping" title=" brain mapping"> brain mapping</a> </p> <a href="https://publications.waset.org/abstracts/105037/the-use-of-network-tool-for-brain-signal-data-analysis-a-case-study-with-blind-and-sighted-individuals" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/105037.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">182</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">&lsaquo;</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=156">156</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=157">157</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=brain%20computer%20interface%20%28BCI%29&amp;page=2" rel="next">&rsaquo;</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>

Pages: 1 2 3 4 5 6 7 8 9 10