CINXE.COM
Search results for: Public emotion recognition
<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: Public emotion recognition</title> <meta name="description" content="Search results for: Public emotion recognition"> <meta name="keywords" content="Public emotion recognition"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" alt="Open Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="Public emotion recognition" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="Public emotion recognition"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 1926</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: Public emotion recognition</h1> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1926</span> Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Vesna%20Kirandziska">Vesna Kirandziska</a>, <a href="https://publications.waset.org/search?q=Nevena%20Ackovska"> Nevena Ackovska</a>, <a href="https://publications.waset.org/search?q=Ana%20Madevska%20Bogdanova"> Ana Madevska Bogdanova</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>The problem of emotion recognition is a challenging problem. It is still an open problem from the aspect of both intelligent systems and psychology. In this paper, both voice features and facial features are used for building an emotion recognition system. A Support Vector Machine classifiers are built by using raw data from video recordings. In this paper, the results obtained for the emotion recognition are given, and a discussion about the validity and the expressiveness of different emotions is presented. A comparison between the classifiers build from facial data only, voice data only and from the combination of both data is made here. The need for a better combination of the information from facial expression and voice data is argued.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Emotion%20recognition" title="Emotion recognition">Emotion recognition</a>, <a href="https://publications.waset.org/search?q=facial%20recognition" title=" facial recognition"> facial recognition</a>, <a href="https://publications.waset.org/search?q=signal%20processing" title=" signal processing"> signal processing</a>, <a href="https://publications.waset.org/search?q=machine%20learning." title=" machine learning. "> machine learning. </a> </p> <a href="https://publications.waset.org/10004221/comparing-emotion-recognition-from-voice-and-facial-data-using-time-invariant-features" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10004221/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10004221/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10004221/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10004221/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10004221/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10004221/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10004221/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10004221/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10004221/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10004221/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10004221.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">2018</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1925</span> Deep-Learning Based Approach to Facial Emotion Recognition Through Convolutional Neural Network</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Nouha%20Khediri">Nouha Khediri</a>, <a href="https://publications.waset.org/search?q=Mohammed%20Ben%20Ammar"> Mohammed Ben Ammar</a>, <a href="https://publications.waset.org/search?q=Monji%20Kherallah"> Monji Kherallah</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>Recently, facial emotion recognition (FER) has become increasingly essential to understand the state of the human mind. However, accurately classifying emotion from the face is a challenging task. In this paper, we present a facial emotion recognition approach named CV-FER benefiting from deep learning, especially CNN and VGG16. First, the data are pre-processed with data cleaning and data rotation. Then, we augment the data and proceed to our FER model, which contains five convolutions layers and five pooling layers. Finally, a softmax classifier is used in the output layer to recognize emotions. Based on the above contents, this paper reviews the works of facial emotion recognition based on deep learning. Experiments show that our model outperforms the other methods using the same FER2013 database and yields a recognition rate of 92%. We also put forward some suggestions for future work. </p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=CNN" title="CNN">CNN</a>, <a href="https://publications.waset.org/search?q=deep-learning" title=" deep-learning"> deep-learning</a>, <a href="https://publications.waset.org/search?q=facial%20emotion%20recognition" title=" facial emotion recognition"> facial emotion recognition</a>, <a href="https://publications.waset.org/search?q=machine%20learning." title=" machine learning."> machine learning.</a> </p> <a href="https://publications.waset.org/10012968/deep-learning-based-approach-to-facial-emotion-recognition-through-convolutional-neural-network" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10012968/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10012968/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10012968/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10012968/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10012968/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10012968/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10012968/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10012968/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10012968/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10012968/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10012968.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">710</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1924</span> Composite Kernels for Public Emotion Recognition from Twitter</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Chien-Hung%20Chen">Chien-Hung Chen</a>, <a href="https://publications.waset.org/search?q=Yan-Chun%20Hsing"> Yan-Chun Hsing</a>, <a href="https://publications.waset.org/search?q=Yung-Chun%20Chang"> Yung-Chun Chang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>The Internet has grown into a powerful medium for information dispersion and social interaction that leads to a rapid growth of social media which allows users to easily post their emotions and perspectives regarding certain topics online. Our research aims at using natural language processing and text mining techniques to explore the public emotions expressed on Twitter by analyzing the sentiment behind tweets. In this paper, we propose a composite kernel method that integrates tree kernel with the linear kernel to simultaneously exploit both the tree representation and the distributed emotion keyword representation to analyze the syntactic and content information in tweets. The experiment results demonstrate that our method can effectively detect public emotion of tweets while outperforming the other compared methods.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Public%20emotion%20recognition" title="Public emotion recognition">Public emotion recognition</a>, <a href="https://publications.waset.org/search?q=natural%20language%20processing" title=" natural language processing"> natural language processing</a>, <a href="https://publications.waset.org/search?q=composite%20kernel" title=" composite kernel"> composite kernel</a>, <a href="https://publications.waset.org/search?q=sentiment%20analysis" title=" sentiment analysis"> sentiment analysis</a>, <a href="https://publications.waset.org/search?q=text%20mining." title=" text mining."> text mining.</a> </p> <a href="https://publications.waset.org/10009602/composite-kernels-for-public-emotion-recognition-from-twitter" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10009602/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10009602/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10009602/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10009602/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10009602/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10009602/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10009602/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10009602/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10009602/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10009602/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10009602.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">773</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1923</span> Analysis of Feature Space for a 2d/3d Vision based Emotion Recognition Method</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Robert%20Niese">Robert Niese</a>, <a href="https://publications.waset.org/search?q=Ayoub%20Al-Hamadi"> Ayoub Al-Hamadi</a>, <a href="https://publications.waset.org/search?q=Bernd%20Michaelis"> Bernd Michaelis</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In modern human computer interaction systems (HCI), emotion recognition is becoming an imperative characteristic. The quest for effective and reliable emotion recognition in HCI has resulted in a need for better face detection, feature extraction and classification. In this paper we present results of feature space analysis after briefly explaining our fully automatic vision based emotion recognition method. We demonstrate the compactness of the feature space and show how the 2d/3d based method achieves superior features for the purpose of emotion classification. Also it is exposed that through feature normalization a widely person independent feature space is created. As a consequence, the classifier architecture has only a minor influence on the classification result. This is particularly elucidated with the help of confusion matrices. For this purpose advanced classification algorithms, such as Support Vector Machines and Artificial Neural Networks are employed, as well as the simple k- Nearest Neighbor classifier. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Facial%20expression%20analysis" title="Facial expression analysis">Facial expression analysis</a>, <a href="https://publications.waset.org/search?q=Feature%20extraction" title=" Feature extraction"> Feature extraction</a>, <a href="https://publications.waset.org/search?q=Image%20processing" title=" Image processing"> Image processing</a>, <a href="https://publications.waset.org/search?q=Pattern%20Recognition" title=" Pattern Recognition"> Pattern Recognition</a>, <a href="https://publications.waset.org/search?q=Application." title=" Application."> Application.</a> </p> <a href="https://publications.waset.org/4039/analysis-of-feature-space-for-a-2d3d-vision-based-emotion-recognition-method" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/4039/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/4039/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/4039/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/4039/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/4039/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/4039/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/4039/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/4039/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/4039/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/4039/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/4039.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1923</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1922</span> An Artificial Emotion Model For Visualizing Emotion of Characters</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Junseok%20Ham">Junseok Ham</a>, <a href="https://publications.waset.org/search?q=Chansun%20Jung"> Chansun Jung</a>, <a href="https://publications.waset.org/search?q=Junhyung%20Park"> Junhyung Park</a>, <a href="https://publications.waset.org/search?q=Jihye%20Ryeo"> Jihye Ryeo</a>, <a href="https://publications.waset.org/search?q=Ilju%20Ko"> Ilju Ko</a> </p> <p class="card-text"><strong>Abstract:</strong></p> It is hard to express emotion through only speech when we watch a character in a movie or a play because we cannot estimate the size, kind, and quantity of emotion. So this paper proposes an artificial emotion model for visualizing current emotion with color and location in emotion model. The artificial emotion model is designed considering causality of generated emotion, difference of personality, difference of continual emotional stimulus, and co-relation of various emotions. This paper supposed the Emotion Field for visualizing current emotion with location, and current emotion is expressed by location and color in the Emotion Field. For visualizing changes within current emotion, the artificial emotion model is adjusted to characters in Hamlet. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Emotion" title="Emotion">Emotion</a>, <a href="https://publications.waset.org/search?q=Artificial%20Emotion" title=" Artificial Emotion"> Artificial Emotion</a>, <a href="https://publications.waset.org/search?q=Visualizing" title=" Visualizing"> Visualizing</a>, <a href="https://publications.waset.org/search?q=EmotionModel." title=" EmotionModel."> EmotionModel.</a> </p> <a href="https://publications.waset.org/9375/an-artificial-emotion-model-for-visualizing-emotion-of-characters" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/9375/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/9375/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/9375/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/9375/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/9375/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/9375/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/9375/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/9375/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/9375/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/9375/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/9375.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1250</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1921</span> Multimodal Database of Emotional Speech, Video and Gestures</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Tomasz%20Sapi%C5%84ski">Tomasz Sapi艅ski</a>, <a href="https://publications.waset.org/search?q=Dorota%20Kami%C5%84ska"> Dorota Kami艅ska</a>, <a href="https://publications.waset.org/search?q=Adam%20Pelikant"> Adam Pelikant</a>, <a href="https://publications.waset.org/search?q=Egils%20Avots"> Egils Avots</a>, <a href="https://publications.waset.org/search?q=Cagri%20Ozcinar"> Cagri Ozcinar</a>, <a href="https://publications.waset.org/search?q=Gholamreza%20Anbarjafari"> Gholamreza Anbarjafari</a> </p> <p class="card-text"><strong>Abstract:</strong></p> People express emotions through different modalities. Integration of verbal and non-verbal communication channels creates a system in which the message is easier to understand. Expanding the focus to several expression forms can facilitate research on emotion recognition as well as human-machine interaction. In this article, the authors present a Polish emotional database composed of three modalities: facial expressions, body movement and gestures, and speech. The corpora contains recordings registered in studio conditions, acted out by 16 professional actors (8 male and 8 female). The data is labeled with six basic emotions categories, according to Ekman’s emotion categories. To check the quality of performance, all recordings are evaluated by experts and volunteers. The database is available to academic community and might be useful in the study on audio-visual emotion recognition. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Body%20movement" title="Body movement">Body movement</a>, <a href="https://publications.waset.org/search?q=emotion%20recognition" title=" emotion recognition"> emotion recognition</a>, <a href="https://publications.waset.org/search?q=emotional%0D%0Acorpus" title=" emotional corpus"> emotional corpus</a>, <a href="https://publications.waset.org/search?q=facial%20expressions" title=" facial expressions"> facial expressions</a>, <a href="https://publications.waset.org/search?q=gestures" title=" gestures"> gestures</a>, <a href="https://publications.waset.org/search?q=multimodal%20database" title=" multimodal database"> multimodal database</a>, <a href="https://publications.waset.org/search?q=speech." title=" speech."> speech.</a> </p> <a href="https://publications.waset.org/10009589/multimodal-database-of-emotional-speech-video-and-gestures" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10009589/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10009589/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10009589/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10009589/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10009589/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10009589/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10009589/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10009589/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10009589/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10009589/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10009589.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1125</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1920</span> Hand Gestures Based Emotion Identification Using Flex Sensors</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=S.%20Ali">S. Ali</a>, <a href="https://publications.waset.org/search?q=R.%20Yunus"> R. Yunus</a>, <a href="https://publications.waset.org/search?q=A.%20Arif"> A. Arif</a>, <a href="https://publications.waset.org/search?q=Y.%20Ayaz"> Y. Ayaz</a>, <a href="https://publications.waset.org/search?q=M.%20Baber%20Sial"> M. Baber Sial</a>, <a href="https://publications.waset.org/search?q=R.%20Asif"> R. Asif</a>, <a href="https://publications.waset.org/search?q=N.%20Naseer"> N. Naseer</a>, <a href="https://publications.waset.org/search?q=M.%20Jawad%20Khan"> M. Jawad Khan </a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>In this study, we have proposed a gesture to emotion recognition method using flex sensors mounted on metacarpophalangeal joints. The flex sensors are fixed in a wearable glove. The data from the glove are sent to PC using Wi-Fi. Four gestures: finger pointing, thumbs up, fist open and fist close are performed by five subjects. Each gesture is categorized into sad, happy, and excited class based on the velocity and acceleration of the hand gesture. Seventeen inspectors observed the emotions and hand gestures of the five subjects. The emotional state based on the investigators assessment and acquired movement speed data is compared. Overall, we achieved 77% accurate results. Therefore, the proposed design can be used for emotional state detection applications.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Emotion%20identification" title="Emotion identification">Emotion identification</a>, <a href="https://publications.waset.org/search?q=emotion%20models" title=" emotion models"> emotion models</a>, <a href="https://publications.waset.org/search?q=gesture%20recognition" title=" gesture recognition"> gesture recognition</a>, <a href="https://publications.waset.org/search?q=user%20perception." title=" user perception."> user perception.</a> </p> <a href="https://publications.waset.org/10009483/hand-gestures-based-emotion-identification-using-flex-sensors" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10009483/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10009483/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10009483/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10009483/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10009483/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10009483/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10009483/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10009483/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10009483/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10009483/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10009483.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">940</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1919</span> Emotion Recognition Using Neural Network: A Comparative Study</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Nermine%20Ahmed%20Hendy">Nermine Ahmed Hendy</a>, <a href="https://publications.waset.org/search?q=Hania%20Farag"> Hania Farag</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>Emotion recognition is an important research field that finds lots of applications nowadays. This work emphasizes on recognizing different emotions from speech signal. The extracted features are related to statistics of pitch, formants, and energy contours, as well as spectral, perceptual and temporal features, jitter, and shimmer. The Artificial Neural Networks (ANN) was chosen as the classifier. Working on finding a robust and fast ANN classifier suitable for different real life application is our concern. Several experiments were carried out on different ANN to investigate the different factors that impact the classification success rate. Using a database containing 7 different emotions, it will be shown that with a proper and careful adjustment of features format, training data sorting, number of features selected and even the ANN type and architecture used, a success rate of 85% or even more can be achieved without increasing the system complicity and the computation time</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Classification" title="Classification">Classification</a>, <a href="https://publications.waset.org/search?q=emotion%20recognition" title=" emotion recognition"> emotion recognition</a>, <a href="https://publications.waset.org/search?q=features%20extraction" title=" features extraction"> features extraction</a>, <a href="https://publications.waset.org/search?q=feature%20selection" title=" feature selection"> feature selection</a>, <a href="https://publications.waset.org/search?q=neural%20network" title=" neural network"> neural network</a> </p> <a href="https://publications.waset.org/11358/emotion-recognition-using-neural-network-a-comparative-study" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/11358/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/11358/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/11358/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/11358/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/11358/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/11358/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/11358/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/11358/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/11358/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/11358/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/11358.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">4698</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1918</span> Facial Emotion Recognition with Convolutional Neural Network Based Architecture</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Koray%20U.%20Erbas">Koray U. Erbas</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>Neural networks are appealing for many applications since they are able to learn complex non-linear relationships between input and output data. As the number of neurons and layers in a neural network increase, it is possible to represent more complex relationships with automatically extracted features. Nowadays Deep Neural Networks (DNNs) are widely used in Computer Vision problems such as; classification, object detection, segmentation image editing etc. In this work, Facial Emotion Recognition task is performed by proposed Convolutional Neural Network (CNN)-based DNN architecture using FER2013 Dataset. Moreover, the effects of different hyperparameters (activation function, kernel size, initializer, batch size and network size) are investigated and ablation study results for Pooling Layer, Dropout and Batch Normalization are presented.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Convolutional%20Neural%20Network" title="Convolutional Neural Network">Convolutional Neural Network</a>, <a href="https://publications.waset.org/search?q=Deep%20Learning" title=" Deep Learning"> Deep Learning</a>, <a href="https://publications.waset.org/search?q=Deep%20Learning%20Based%20FER" title=" Deep Learning Based FER"> Deep Learning Based FER</a>, <a href="https://publications.waset.org/search?q=Facial%20Emotion%20Recognition." title=" Facial Emotion Recognition."> Facial Emotion Recognition.</a> </p> <a href="https://publications.waset.org/10011791/facial-emotion-recognition-with-convolutional-neural-network-based-architecture" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10011791/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10011791/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10011791/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10011791/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10011791/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10011791/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10011791/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10011791/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10011791/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10011791/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10011791.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1371</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1917</span> Using Speech Emotion Recognition as a Longitudinal Biomarker for Alzheimer鈥檚 Disease</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Yishu%20Gong">Yishu Gong</a>, <a href="https://publications.waset.org/search?q=Liangliang%20Yang"> Liangliang Yang</a>, <a href="https://publications.waset.org/search?q=Jianyu%20Zhang"> Jianyu Zhang</a>, <a href="https://publications.waset.org/search?q=Zhengyu%20Chen"> Zhengyu Chen</a>, <a href="https://publications.waset.org/search?q=Sihong%20He"> Sihong He</a>, <a href="https://publications.waset.org/search?q=Xusheng%20Zhang"> Xusheng Zhang</a>, <a href="https://publications.waset.org/search?q=Wei%20Zhang"> Wei Zhang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>Alzheimer鈥檚 disease (AD) is a progressive neurodegenerative disorder that affects millions of people worldwide and is characterized by cognitive decline and behavioral changes. People living with Alzheimer鈥檚 disease often find it hard to complete routine tasks. However, there are limited objective assessments that aim to quantify the difficulty of certain tasks for AD patients compared to non-AD people. In this study, we propose to use speech emotion recognition (SER), especially the frustration level as a potential biomarker for quantifying the difficulty patients experience when describing a picture. We build an SER model using data from the IEMOCAP dataset and apply the model to the DementiaBank data to detect the AD/non-AD group difference and perform longitudinal analysis to track the AD disease progression. Our results show that the frustration level detected from the SER model can possibly be used as a cost-effective tool for objective tracking of AD progression in addition to the Mini-Mental State Examination (MMSE) score.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Alzheimer%E2%80%99s%20disease" title="Alzheimer鈥檚 disease">Alzheimer鈥檚 disease</a>, <a href="https://publications.waset.org/search?q=Speech%20Emotion%20Recognition" title=" Speech Emotion Recognition"> Speech Emotion Recognition</a>, <a href="https://publications.waset.org/search?q=longitudinal%20biomarker" title=" longitudinal biomarker"> longitudinal biomarker</a>, <a href="https://publications.waset.org/search?q=machine%20learning." title=" machine learning."> machine learning.</a> </p> <a href="https://publications.waset.org/10013336/using-speech-emotion-recognition-as-a-longitudinal-biomarker-for-alzheimers-disease" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10013336/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10013336/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10013336/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10013336/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10013336/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10013336/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10013336/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10013336/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10013336/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10013336/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10013336.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">274</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1916</span> Improving the Performance of Deep Learning in Facial Emotion Recognition with Image Sharpening</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Ksheeraj%20Sai%20Vepuri">Ksheeraj Sai Vepuri</a>, <a href="https://publications.waset.org/search?q=Nada%20Attar"> Nada Attar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> We as humans use words with accompanying visual and facial cues to communicate effectively. Classifying facial emotion using computer vision methodologies has been an active research area in the computer vision field. In this paper, we propose a simple method for facial expression recognition that enhances accuracy. We tested our method on the FER-2013 dataset that contains static images. Instead of using Histogram equalization to preprocess the dataset, we used Unsharp Mask to emphasize texture and details and sharpened the edges. We also used ImageDataGenerator from Keras library for data augmentation. Then we used Convolutional Neural Networks (CNN) model to classify the images into 7 different facial expressions, yielding an accuracy of 69.46% on the test set. Our results show that using image preprocessing such as the sharpening technique for a CNN model can improve the performance, even when the CNN model is relatively simple. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Facial%20expression%20recognition" title="Facial expression recognition">Facial expression recognition</a>, <a href="https://publications.waset.org/search?q=image%20pre-processing" title=" image pre-processing"> image pre-processing</a>, <a href="https://publications.waset.org/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/search?q=CNN." title=" CNN. "> CNN. </a> </p> <a href="https://publications.waset.org/10011940/improving-the-performance-of-deep-learning-in-facial-emotion-recognition-with-image-sharpening" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10011940/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10011940/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10011940/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10011940/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10011940/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10011940/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10011940/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10011940/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10011940/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10011940/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10011940.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">544</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1915</span> Automatic Recognition of Emotionally Coloured Speech</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Theologos%20Athanaselis">Theologos Athanaselis</a>, <a href="https://publications.waset.org/search?q=Stelios%20Bakamidis"> Stelios Bakamidis</a>, <a href="https://publications.waset.org/search?q=Ioannis%20Dologlou"> Ioannis Dologlou</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Emotion in speech is an issue that has been attracting the interest of the speech community for many years, both in the context of speech synthesis as well as in automatic speech recognition (ASR). In spite of the remarkable recent progress in Large Vocabulary Recognition (LVR), it is still far behind the ultimate goal of recognising free conversational speech uttered by any speaker in any environment. Current experimental tests prove that using state of the art large vocabulary recognition systems the error rate increases substantially when applied to spontaneous/emotional speech. This paper shows that recognition rate for emotionally coloured speech can be improved by using a language model based on increased representation of emotional utterances. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Statistical%20language%20model" title="Statistical language model">Statistical language model</a>, <a href="https://publications.waset.org/search?q=N-grams" title=" N-grams"> N-grams</a>, <a href="https://publications.waset.org/search?q=emotionallycoloured%20speech" title=" emotionallycoloured speech"> emotionallycoloured speech</a> </p> <a href="https://publications.waset.org/1891/automatic-recognition-of-emotionally-coloured-speech" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/1891/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/1891/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/1891/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/1891/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/1891/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/1891/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/1891/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/1891/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/1891/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/1891/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/1891.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1618</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1914</span> Cognitive Emotion Regulation in Children Is Attributable to Parenting Style, Not to Family Type and Child鈥檚 Gender</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=AKM%20Rezaul%20Karim">AKM Rezaul Karim</a>, <a href="https://publications.waset.org/search?q=Tania%20Sharafat"> Tania Sharafat</a>, <a href="https://publications.waset.org/search?q=Abu%20Yusuf%20Mahmud"> Abu Yusuf Mahmud</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>The study aimed to investigate whether cognitive emotion regulation in children varies with parenting style, family type and gender. Toward this end, cognitive emotion regulation and perceived parenting style of 206 school children were measured. Standard regression analyses of data revealed that the models were significant and explained 17.3% of the variance in <em>adaptive</em> emotion regulation (Adjusted <em>R²</em>=0.173; <em>F</em>=9.579, <em>p</em><.001), and 7.1% of the variance in <em>less adaptive</em> emotion regulation (Adjusted <em>R²</em>=.071, <em>F</em>=4.135, <em>p</em>=.001). Results showed that children’s cognitive emotion regulation is functionally associated with parenting style, but not with family type and their gender. Amongst three types of parenting, authoritative parenting was the strongest predictor of the overall <em>adaptive</em> emotion regulation while authoritarian parenting was the strongest predictor of the overall <em>less adaptive</em> emotion regulation. Permissive parenting has impact neither on <em>adaptive</em> nor on <em>less adaptive</em> emotion regulation. The findings would have important implications for parents, caregivers, child psychologists, and other professionals working with children or adolescents.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Cognitive%20Emotion%20Regulation" title="Cognitive Emotion Regulation">Cognitive Emotion Regulation</a>, <a href="https://publications.waset.org/search?q=Adaptive" title=" Adaptive"> Adaptive</a>, <a href="https://publications.waset.org/search?q=Less%20Adaptive" title=" Less Adaptive"> Less Adaptive</a>, <a href="https://publications.waset.org/search?q=Parenting%20Style" title=" Parenting Style"> Parenting Style</a>, <a href="https://publications.waset.org/search?q=Family%20Type." title=" Family Type."> Family Type.</a> </p> <a href="https://publications.waset.org/9996937/cognitive-emotion-regulation-in-children-is-attributable-to-parenting-style-not-to-family-type-and-childs-gender" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/9996937/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/9996937/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/9996937/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/9996937/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/9996937/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/9996937/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/9996937/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/9996937/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/9996937/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/9996937/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/9996937.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">3700</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1913</span> Intelligent Agent System Simulation Using Fear Emotion</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Latifeh%20PourMohammadBagher">Latifeh PourMohammadBagher</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper I have developed a system for evaluating the degree of fear emotion that the intelligent agent-based system may feel when it encounters to a persecuting event. In this paper I want to describe behaviors of emotional agents using human behavior in terms of the way their emotional states evolve over time. I have implemented a fuzzy inference system using Java environment. As the inputs of this system, I have considered three parameters related on human fear emotion. The system outputs can be used in agent decision making process or choosing a person for team working systems by combination the intensity of fear to other emotion intensities. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Emotion%20simulation" title="Emotion simulation">Emotion simulation</a>, <a href="https://publications.waset.org/search?q=Fear" title=" Fear"> Fear</a>, <a href="https://publications.waset.org/search?q=Fuzzy%20intelligent%20agent" title=" Fuzzy intelligent agent"> Fuzzy intelligent agent</a> </p> <a href="https://publications.waset.org/7247/intelligent-agent-system-simulation-using-fear-emotion" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/7247/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/7247/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/7247/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/7247/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/7247/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/7247/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/7247/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/7247/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/7247/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/7247/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/7247.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1462</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1912</span> Analyzing Artificial Emotion in Game Characters Using Soft Computing</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Musbah%20M.%20Aqel">Musbah M. Aqel</a>, <a href="https://publications.waset.org/search?q=P.%20K.%20Mahanti"> P. K. Mahanti</a>, <a href="https://publications.waset.org/search?q=Soumya%20Banerjee"> Soumya Banerjee</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>This paper describes a simulation model for analyzing artificial emotion injected to design the game characters. Most of the game storyboard is interactive in nature and the virtual characters of the game are equipped with an individual personality and dynamic emotion value which is similar to real life emotion and behavior. The uncertainty in real expression, mood and behavior is also exhibited in game paradigm and this is focused in the present paper through a fuzzy logic based agent and storyboard. Subsequently, a pheromone distribution or labeling is presented mimicking the behavior of social insects.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Artificial%20Emotion" title="Artificial Emotion">Artificial Emotion</a>, <a href="https://publications.waset.org/search?q=Fuzzy%20logic" title=" Fuzzy logic"> Fuzzy logic</a>, <a href="https://publications.waset.org/search?q=Game%20character" title=" Game character"> Game character</a>, <a href="https://publications.waset.org/search?q=Pheromone%20label" title=" Pheromone label"> Pheromone label</a> </p> <a href="https://publications.waset.org/13041/analyzing-artificial-emotion-in-game-characters-using-soft-computing" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/13041/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/13041/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/13041/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/13041/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/13041/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/13041/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/13041/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/13041/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/13041/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/13041/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/13041.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1311</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1911</span> Facial Expression Phoenix (FePh): An Annotated Sequenced Dataset for Facial and Emotion-Specified Expressions in Sign Language</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Marie%20Alaghband">Marie Alaghband</a>, <a href="https://publications.waset.org/search?q=Niloofar%20Yousefi"> Niloofar Yousefi</a>, <a href="https://publications.waset.org/search?q=Ivan%20Garibay"> Ivan Garibay</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Facial expressions are important parts of both gesture and sign language recognition systems. Despite the recent advances in both fields, annotated facial expression datasets in the context of sign language are still scarce resources. In this manuscript, we introduce an annotated sequenced facial expression dataset in the context of sign language, comprising over 3000 facial images extracted from the daily news and weather forecast of the public tv-station PHOENIX. Unlike the majority of currently existing facial expression datasets, FePh provides sequenced semi-blurry facial images with different head poses, orientations, and movements. In addition, in the majority of images, identities are mouthing the words, which makes the data more challenging. To annotate this dataset we consider primary, secondary, and tertiary dyads of seven basic emotions of "sad", "surprise", "fear", "angry", "neutral", "disgust", and "happy". We also considered the "None" class if the image’s facial expression could not be described by any of the aforementioned emotions. Although we provide FePh as a facial expression dataset of signers in sign language, it has a wider application in gesture recognition and Human Computer Interaction (HCI) systems. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Annotated%20Facial%20Expression%20Dataset" title="Annotated Facial Expression Dataset">Annotated Facial Expression Dataset</a>, <a href="https://publications.waset.org/search?q=Sign%20Language%0D%0ARecognition" title=" Sign Language Recognition"> Sign Language Recognition</a>, <a href="https://publications.waset.org/search?q=Gesture%20Recognition" title=" Gesture Recognition"> Gesture Recognition</a>, <a href="https://publications.waset.org/search?q=Sequenced%20Facial%20Expression%0D%0ADataset." title=" Sequenced Facial Expression Dataset."> Sequenced Facial Expression Dataset.</a> </p> <a href="https://publications.waset.org/10011933/facial-expression-phoenix-feph-an-annotated-sequenced-dataset-for-facial-and-emotion-specified-expressions-in-sign-language" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10011933/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10011933/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10011933/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10011933/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10011933/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10011933/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10011933/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10011933/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10011933/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10011933/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10011933.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">720</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1910</span> Face Recognition: A Literature Review</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=A.%20S.%20Tolba">A. S. Tolba</a>, <a href="https://publications.waset.org/search?q=A.H.%20El-Baz"> A.H. El-Baz</a>, <a href="https://publications.waset.org/search?q=A.A.%20El-Harby"> A.A. El-Harby</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The task of face recognition has been actively researched in recent years. This paper provides an up-to-date review of major human face recognition research. We first present an overview of face recognition and its applications. Then, a literature review of the most recent face recognition techniques is presented. Description and limitations of face databases which are used to test the performance of these face recognition algorithms are given. A brief summary of the face recognition vendor test (FRVT) 2002, a large scale evaluation of automatic face recognition technology, and its conclusions are also given. Finally, we give a summary of the research results. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Combined%20classifiers" title="Combined classifiers">Combined classifiers</a>, <a href="https://publications.waset.org/search?q=face%20recognition" title=" face recognition"> face recognition</a>, <a href="https://publications.waset.org/search?q=graph%20matching" title=" graph matching"> graph matching</a>, <a href="https://publications.waset.org/search?q=neural%20networks." title=" neural networks."> neural networks.</a> </p> <a href="https://publications.waset.org/7912/face-recognition-a-literature-review" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/7912/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/7912/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/7912/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/7912/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/7912/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/7912/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/7912/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/7912/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/7912/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/7912/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/7912.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">7723</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1909</span> Locus of Control, Emotion Venting Strategy and Internet Addiction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Jia-Ru%20Li">Jia-Ru Li</a>, <a href="https://publications.waset.org/search?q=Chih-Hung%20Wang"> Chih-Hung Wang</a>, <a href="https://publications.waset.org/search?q=Ching-Wen%20Lin"> Ching-Wen Lin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Internet addiction has become a critical problem on adolescents in Taiwan, and its negative effects on various dimensions of adolescent development caught the attention of educational and psychological experts. This study examined the correlation between cognitive (locus of control) and emotion (emotion venting strategies) factors on internet addiction of adolescents in Taiwan. Using the Compulsive Internet Use (CIU) and the Emotion Venting Strategy scales, a survey was conducted and 215 effective samples (students ranging from12 to14 years old) returned. Quantitative analysis methods such as descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression were adopted. The results were as follows: 1. Severity of Internet addiction has significant gender differences; boys were at a higher risk than girls in becoming addicted to the Internet. 2. Emotion venting, locus of control and internet addiction have been shown to be positive correlated with one another. 3. Setting the locus of control as the control variable, emotion venting strategy has positive and significant contribution to internet addiction. The results of this study suggest that coaching deconstructive emotion strategies and cognitive believes are encouraged to integrate with actual field work. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Emotion%20venting%20strategy" title="Emotion venting strategy">Emotion venting strategy</a>, <a href="https://publications.waset.org/search?q=locus%20of%20control" title=" locus of control"> locus of control</a>, <a href="https://publications.waset.org/search?q=adolescent%20internet%20addiction." title=" adolescent internet addiction."> adolescent internet addiction.</a> </p> <a href="https://publications.waset.org/14547/locus-of-control-emotion-venting-strategy-and-internet-addiction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/14547/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/14547/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/14547/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/14547/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/14547/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/14547/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/14547/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/14547/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/14547/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/14547/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/14547.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">3104</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1908</span> Emotions and Message Sharing on the Chinese Microblog</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Yungeng%20Xie">Yungeng Xie</a>, <a href="https://publications.waset.org/search?q=Cong%20Liu"> Cong Liu</a>, <a href="https://publications.waset.org/search?q=Yi%20Liu"> Yi Liu</a>, <a href="https://publications.waset.org/search?q=Xuanao%20Wan"> Xuanao Wan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>The study aims to explore microblog users’ emotion expression and sharing behaviors on the Chinese microblog (Weibo). The first theme of study analyzed whether microblog emotions impact readers’ message sharing behaviors, specifically, how the strength of emotion (positive and negative) in microblog messages facilitate/inhibit readers’ sharing behaviors. The second theme compared the differences among the three types of microblog users (i.e., verified enterprise users, verified individual users and unverified users) in terms of their profiles and microblog behaviors. A total of 7114 microblog messages about 24 hot public events in China were sampled from Sina Weibo. The first study results show that strength of negative emotions that microblog messages carry significantly increase the possibility of the message being shared. The second study results indicate that there are significant differences across the three types of users in terms of their emotion expression and its influence on microblog behaviors.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Microblog" title="Microblog">Microblog</a>, <a href="https://publications.waset.org/search?q=emotion%20expression" title=" emotion expression"> emotion expression</a>, <a href="https://publications.waset.org/search?q=information%20diffusion." title=" information diffusion."> information diffusion.</a> </p> <a href="https://publications.waset.org/10007070/emotions-and-message-sharing-on-the-chinese-microblog" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10007070/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10007070/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10007070/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10007070/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10007070/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10007070/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10007070/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10007070/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10007070/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10007070/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10007070.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">813</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1907</span> The Development of Positive Emotion Regulation Strategies Scale for Children and Adolescents</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Jia-Ru%20Li">Jia-Ru Li</a>, <a href="https://publications.waset.org/search?q=Ching-Wen%20Lin"> Ching-Wen Lin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The study was designed to develop a measurement of the positive emotion regulation questionnaire (PERQ) that assesses positive emotion regulation strategies through self-report. The 14 items developed for the surveying instrument of the study were based upon literatures regarding elements of positive regulation strategies. 319 elementary students (age ranging from 12 to14) were recruited among three public elementary schools to survey on their use of positive emotion regulation strategies. Of 319 subjects, 20 invalid questionnaire s yielded a response rate of 92%. The data collected wasanalyzed through methods such as item analysis, factor analysis, and structural equation models. In reference to the results from item analysis, the formal survey instrument was reduced to 11 items. A principal axis factor analysis with varimax was performed on responses, resulting in a 2-factor equation (savoring strategy and neutralizing strategy), which accounted for 55.5% of the total variance. Then, the two-factor structure of scale was also identified by structural equation models. Finally, the reliability coefficients of the two factors were Cronbach-s 伪 .92 and .74. Gender difference was only found in savoring strategy. In conclusion, the positive emotion regulation strategies questionnaire offers a brief, internally consistent, and valid self-report measure for understanding the emotional regulation strategies of children that may be useful to researchers and applied professionals. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Emotional%20regulation" title="Emotional regulation">Emotional regulation</a>, <a href="https://publications.waset.org/search?q=emotional%20regulation%20strategies" title=" emotional regulation strategies"> emotional regulation strategies</a>, <a href="https://publications.waset.org/search?q=scale" title=" scale"> scale</a>, <a href="https://publications.waset.org/search?q=SEM." title=" SEM."> SEM.</a> </p> <a href="https://publications.waset.org/6898/the-development-of-positive-emotion-regulation-strategies-scale-for-children-and-adolescents" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/6898/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/6898/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/6898/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/6898/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/6898/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/6898/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/6898/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/6898/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/6898/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/6898/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/6898.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1991</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1906</span> Comparing Arabic and Latin Handwritten Digits Recognition Problems</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Sherif%20Abdelazeem">Sherif Abdelazeem</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A comparison between the performance of Latin and Arabic handwritten digits recognition problems is presented. The performance of ten different classifiers is tested on two similar Arabic and Latin handwritten digits databases. The analysis shows that Arabic handwritten digits recognition problem is easier than that of Latin digits. This is because the interclass difference in case of Latin digits is smaller than in Arabic digits and variances in writing Latin digits are larger. Consequently, weaker yet fast classifiers are expected to play more prominent role in Arabic handwritten digits recognition. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Handwritten%20recognition" title="Handwritten recognition">Handwritten recognition</a>, <a href="https://publications.waset.org/search?q=Arabic%20recognition" title=" Arabic recognition"> Arabic recognition</a>, <a href="https://publications.waset.org/search?q=Digits%20recognition" title=" Digits recognition"> Digits recognition</a>, <a href="https://publications.waset.org/search?q=Document%20recognition" title=" Document recognition"> Document recognition</a> </p> <a href="https://publications.waset.org/5070/comparing-arabic-and-latin-handwritten-digits-recognition-problems" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/5070/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/5070/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/5070/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/5070/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/5070/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/5070/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/5070/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/5070/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/5070/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/5070/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/5070.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1986</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1905</span> The Relationship between Adolescent Emotional Inhibition and Depression Disorder: The Moderate Effect of Gender</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Jia-Ru%20Li">Jia-Ru Li</a>, <a href="https://publications.waset.org/search?q=Chih-Hung%20Wang"> Chih-Hung Wang</a>, <a href="https://publications.waset.org/search?q=Ching-Wen%20Lin"> Ching-Wen Lin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>The association between emotional inhibition strategies linked to depression has been showed inconsistent among studies. Mild emotional inhibition maybe benefit for social interaction, especially for female among East Asian cultures. The present study aimed to examine whether the inhibition–depression relationship is dependent on level of emotion inhibition and gender context, given differing value of suppressing emotional displays. We hypothesized that the negative associations between inhibition and adolescent depression would not directly, in which affected by interaction between emotion inhibition and gender. To test this hypothesis, we asked 309 junior high school students which age range from 12 to14 years old to report on their use of emotion inhibition and depression syndrome. A multiple regressions analysis revealed that significant interaction that gender as a moderator to the relationships between emotion inhibition and adolescent depression. The group with the highest level of depression was girls with high levels of emotion inhibition, whose depression score was higher than that of boys with high levels of emotion inhibition. The result highlights that the importance of context in understanding the inhibition-depression relationship.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Emotional%20inhibition%20strategies" title="Emotional inhibition strategies">Emotional inhibition strategies</a>, <a href="https://publications.waset.org/search?q=gender" title=" gender"> gender</a>, <a href="https://publications.waset.org/search?q=adolescent%20depression." title=" adolescent depression."> adolescent depression.</a> </p> <a href="https://publications.waset.org/8577/the-relationship-between-adolescent-emotional-inhibition-and-depression-disorder-the-moderate-effect-of-gender" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/8577/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/8577/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/8577/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/8577/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/8577/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/8577/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/8577/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/8577/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/8577/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/8577/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/8577.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">2082</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1904</span> 3D Face Recognition Using Modified PCA Methods</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Omid%20Gervei">Omid Gervei</a>, <a href="https://publications.waset.org/search?q=Ahmad%20Ayatollahi"> Ahmad Ayatollahi</a>, <a href="https://publications.waset.org/search?q=Navid%20Gervei"> Navid Gervei</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper we present an approach for 3D face recognition based on extracting principal components of range images by utilizing modified PCA methods namely 2DPCA and bidirectional 2DPCA also known as (2D) 2 PCA.A preprocessing stage was implemented on the images to smooth them using median and Gaussian filtering. In the normalization stage we locate the nose tip to lay it at the center of images then crop each image to a standard size of 100*100. In the face recognition stage we extract the principal component of each image using both 2DPCA and (2D) 2 PCA. Finally, we use Euclidean distance to measure the minimum distance between a given test image to the training images in the database. We also compare the result of using both methods. The best result achieved by experiments on a public face database shows that 83.3 percent is the rate of face recognition for a random facial expression. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=3D%20face%20recognition" title="3D face recognition">3D face recognition</a>, <a href="https://publications.waset.org/search?q=2DPCA" title=" 2DPCA"> 2DPCA</a>, <a href="https://publications.waset.org/search?q=%282D%29%202%20PCA" title=" (2D) 2 PCA"> (2D) 2 PCA</a>, <a href="https://publications.waset.org/search?q=Rangeimage" title=" Rangeimage"> Rangeimage</a> </p> <a href="https://publications.waset.org/5789/3d-face-recognition-using-modified-pca-methods" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/5789/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/5789/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/5789/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/5789/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/5789/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/5789/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/5789/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/5789/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/5789/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/5789/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/5789.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">3066</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1903</span> Interactive Agents with Artificial Mind</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Hirohide%20Ushida">Hirohide Ushida</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper discusses an artificial mind model and its applications. The mind model is based on some theories which assert that emotion is an important function in human decision making. An artificial mind model with emotion is built, and the model is applied to action selection of autonomous agents. In three examples, the agents interact with humans and their environments. The examples show the proposed model effectively work in both virtual agents and real robots. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Artificial%20mind" title="Artificial mind">Artificial mind</a>, <a href="https://publications.waset.org/search?q=emotion" title=" emotion"> emotion</a>, <a href="https://publications.waset.org/search?q=interactive%20agent" title=" interactive agent"> interactive agent</a>, <a href="https://publications.waset.org/search?q=pet%20robot" title=" pet robot"> pet robot</a> </p> <a href="https://publications.waset.org/12469/interactive-agents-with-artificial-mind" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/12469/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/12469/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/12469/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/12469/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/12469/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/12469/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/12469/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/12469/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/12469/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/12469/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/12469.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1252</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1902</span> OCR/ICR Text Recognition Using ABBYY FineReader as an Example Text</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=A.%20R.%20Bagirzade">A. R. Bagirzade</a>, <a href="https://publications.waset.org/search?q=A.%20Sh.%20Najafova"> A. Sh. Najafova</a>, <a href="https://publications.waset.org/search?q=S.%20M.%20Yessirkepova"> S. M. Yessirkepova</a>, <a href="https://publications.waset.org/search?q=E.%20S.%20Albert"> E. S. Albert</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>This article describes a text recognition method based on Optical Character Recognition (OCR). The features of the OCR method were examined using the ABBYY FineReader program. It describes automatic text recognition in images. OCR is necessary because optical input devices can only transmit raster graphics as a result. Text recognition describes the task of recognizing letters shown as such, to identify and assign them an assigned numerical value in accordance with the usual text encoding (ASCII, Unicode). The peculiarity of this study conducted by the authors using the example of the ABBYY FineReader, was confirmed and shown in practice, the improvement of digital text recognition platforms developed by Electronic Publication.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=ABBYY%20FineReader%20system" title="ABBYY FineReader system">ABBYY FineReader system</a>, <a href="https://publications.waset.org/search?q=algorithm%20symbol%20recognition" title=" algorithm symbol recognition"> algorithm symbol recognition</a>, <a href="https://publications.waset.org/search?q=OCR%2FICR%20techniques" title=" OCR/ICR techniques"> OCR/ICR techniques</a>, <a href="https://publications.waset.org/search?q=recognition%20technologies." title=" recognition technologies. "> recognition technologies. </a> </p> <a href="https://publications.waset.org/10011852/ocricr-text-recognition-using-abbyy-finereader-as-an-example-text" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10011852/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10011852/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10011852/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10011852/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10011852/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10011852/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10011852/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10011852/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10011852/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10011852/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10011852.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">781</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1901</span> Intention Recognition using a Graph Representation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=So-Jeong%20Youn">So-Jeong Youn</a>, <a href="https://publications.waset.org/search?q=Kyung-Whan%20Oh"> Kyung-Whan Oh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The human friendly interaction is the key function of a human-centered system. Over the years, it has received much attention to develop the convenient interaction through intention recognition. Intention recognition processes multimodal inputs including speech, face images, and body gestures. In this paper, we suggest a novel approach of intention recognition using a graph representation called Intention Graph. A concept of valid intention is proposed, as a target of intention recognition. Our approach has two phases: goal recognition phase and intention recognition phase. In the goal recognition phase, we generate an action graph based on the observed actions, and then the candidate goals and their plans are recognized. In the intention recognition phase, the intention is recognized with relevant goals and user profile. We show that the algorithm has polynomial time complexity. The intention graph is applied to a simple briefcase domain to test our model. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Intention%20recognition" title="Intention recognition">Intention recognition</a>, <a href="https://publications.waset.org/search?q=intention" title=" intention"> intention</a>, <a href="https://publications.waset.org/search?q=graph" title=" graph"> graph</a>, <a href="https://publications.waset.org/search?q=HCI." title=" HCI."> HCI.</a> </p> <a href="https://publications.waset.org/11699/intention-recognition-using-a-graph-representation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/11699/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/11699/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/11699/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/11699/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/11699/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/11699/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/11699/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/11699/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/11699/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/11699/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/11699.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">3397</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1900</span> A New Biologically Inspired Pattern Recognition Spproach for Face Recognition</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=V.%20Kabeer">V. Kabeer</a>, <a href="https://publications.waset.org/search?q=N.K.Narayanan"> N.K.Narayanan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>This paper reports a new pattern recognition approach for face recognition. The biological model of light receptors - cones and rods in human eyes and the way they are associated with pattern vision in human vision forms the basis of this approach. The functional model is simulated using CWD and WPD. The paper also discusses the experiments performed for face recognition using the features extracted from images in the AT & T face database. Artificial Neural Network and k- Nearest Neighbour classifier algorithms are employed for the recognition purpose. A feature vector is formed for each of the face images in the database and recognition accuracies are computed and compared using the classifiers. Simulation results show that the proposed method outperforms traditional way of feature extraction methods prevailing for pattern recognition in terms of recognition accuracy for face images with pose and illumination variations.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Face%20recognition" title="Face recognition">Face recognition</a>, <a href="https://publications.waset.org/search?q=Image%20analysis" title=" Image analysis"> Image analysis</a>, <a href="https://publications.waset.org/search?q=Wavelet%20feature%20extraction" title=" Wavelet feature extraction"> Wavelet feature extraction</a>, <a href="https://publications.waset.org/search?q=Pattern%20recognition" title=" Pattern recognition"> Pattern recognition</a>, <a href="https://publications.waset.org/search?q=Classifier%20algorithms" title=" Classifier algorithms"> Classifier algorithms</a> </p> <a href="https://publications.waset.org/13389/a-new-biologically-inspired-pattern-recognition-spproach-for-face-recognition" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/13389/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/13389/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/13389/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/13389/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/13389/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/13389/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/13389/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/13389/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/13389/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/13389/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/13389.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1677</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1899</span> Emotions in Health Tweets: Analysis of American Government Official Accounts</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Garc%C3%ADa%20L%C3%B3pez">Garc铆a L贸pez</a> </p> <p class="card-text"><strong>Abstract:</strong></p> <p>The Government Departments of Health have the task of informing and educating citizens about public health issues. For this, they use channels like Twitter, key in the search for health information and the propagation of content. The tweets, important in the virality of the content, may contain emotions that influence the contagion and exchange of knowledge. The goal of this study is to perform an analysis of the emotional projection of health information shared on Twitter by official American accounts: the disease control account <em>CDCgov</em>, National Institutes of Health, <em>NIH</em>, the government agency <em>HHSGov</em>, and the professional organization <em>PublicHealth</em>. For this, we used Tone Analyzer, an International Business Machines Corporation (IBM) tool specialized in emotion detection in text, corresponding to the categorical model of emotion representation. For 15 days, all tweets from these accounts were analyzed with the emotional analysis tool in text. The results showed that their tweets contain an important emotional load, a determining factor in the success of their communications. This exposes that official accounts also use subjective language and contain emotions. The predominance of emotion joy over sadness and the strong presence of emotions in their tweets stimulate the virality of content, a key in the work of informing that government health departments have.</p> <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Emotions%20in%20tweets%20emotion%20detection%20in%20text" title="Emotions in tweets emotion detection in text">Emotions in tweets emotion detection in text</a>, <a href="https://publications.waset.org/search?q=health%20information%20on%20Twitter" title=" health information on Twitter"> health information on Twitter</a>, <a href="https://publications.waset.org/search?q=American%20health%20official%20accounts" title=" American health official accounts"> American health official accounts</a>, <a href="https://publications.waset.org/search?q=emotions%20on%20Twitter" title=" emotions on Twitter"> emotions on Twitter</a>, <a href="https://publications.waset.org/search?q=emotions%20and%20content." title=" emotions and content."> emotions and content.</a> </p> <a href="https://publications.waset.org/10010145/emotions-in-health-tweets-analysis-of-american-government-official-accounts" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10010145/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10010145/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10010145/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10010145/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10010145/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10010145/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10010145/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10010145/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10010145/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10010145/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10010145.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">697</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1898</span> Affective Robots: Evaluation of Automatic Emotion Recognition Approaches on a Humanoid Robot towards Emotionally Intelligent Machines</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Silvia%20Santano%20Guill%C3%A9n">Silvia Santano Guill茅n</a>, <a href="https://publications.waset.org/search?q=Luigi%20Lo%20Iacono"> Luigi Lo Iacono</a>, <a href="https://publications.waset.org/search?q=Christian%20Meder"> Christian Meder</a> </p> <p class="card-text"><strong>Abstract:</strong></p> One of the main aims of current social robotic research is to improve the robots’ abilities to interact with humans. In order to achieve an interaction similar to that among humans, robots should be able to communicate in an intuitive and natural way and appropriately interpret human affects during social interactions. Similarly to how humans are able to recognize emotions in other humans, machines are capable of extracting information from the various ways humans convey emotions—including facial expression, speech, gesture or text—and using this information for improved human computer interaction. This can be described as Affective Computing, an interdisciplinary field that expands into otherwise unrelated fields like psychology and cognitive science and involves the research and development of systems that can recognize and interpret human affects. To leverage these emotional capabilities by embedding them in humanoid robots is the foundation of the concept Affective Robots, which has the objective of making robots capable of sensing the user’s current mood and personality traits and adapt their behavior in the most appropriate manner based on that. In this paper, the emotion recognition capabilities of the humanoid robot Pepper are experimentally explored, based on the facial expressions for the so-called basic emotions, as well as how it performs in contrast to other state-of-the-art approaches with both expression databases compiled in academic environments and real subjects showing posed expressions as well as spontaneous emotional reactions. The experiments’ results show that the detection accuracy amongst the evaluated approaches differs substantially. The introduced experiments offer a general structure and approach for conducting such experimental evaluations. The paper further suggests that the most meaningful results are obtained by conducting experiments with real subjects expressing the emotions as spontaneous reactions. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Affective%20computing" title="Affective computing">Affective computing</a>, <a href="https://publications.waset.org/search?q=emotion%20recognition" title=" emotion recognition"> emotion recognition</a>, <a href="https://publications.waset.org/search?q=humanoid%0D%0Arobot" title=" humanoid robot"> humanoid robot</a>, <a href="https://publications.waset.org/search?q=Human-Robot-Interaction%20%28HRI%29" title=" Human-Robot-Interaction (HRI)"> Human-Robot-Interaction (HRI)</a>, <a href="https://publications.waset.org/search?q=social%20robots." title=" social robots."> social robots.</a> </p> <a href="https://publications.waset.org/10009027/affective-robots-evaluation-of-automatic-emotion-recognition-approaches-on-a-humanoid-robot-towards-emotionally-intelligent-machines" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10009027/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10009027/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10009027/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10009027/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10009027/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10009027/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10009027/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10009027/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10009027/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10009027/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10009027.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">1355</span> </span> </div> </div> <div class="card publication-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1897</span> The Influence of Job Recognition and Job Motivation on Organizational Commitment in Public Sector: The Mediation Role of Employee Engagement</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/search?q=Muhammad%20Tayyab">Muhammad Tayyab</a>, <a href="https://publications.waset.org/search?q=Saba%20Saira"> Saba Saira</a> </p> <p class="card-text"><strong>Abstract:</strong></p> It is an established fact that organizations across the globe consider employees as their assets and try to advance their well-being. However, the local firms of developing countries are mostly profit oriented and do not have much concern about their employees’ engagement or commitment. Like other developing countries, the local organizations of Pakistan are also less concerned about the well-being of their employees. Especially public sector organizations lack concern regarding engagement, satisfaction or commitment of the employees. Therefore, this study aimed at investigating the impact of job recognition and job motivation on organizational commitment in the mediation role of employee engagement. The data were collected from land record officers of board of revenue, Punjab, Pakistan. Structured questionnaire was used to collect data through physically visiting land record officers and also through the internet. A total of 318 land record officers’ responses were finalized to perform data analysis. The data were analyzed through confirmatory factor analysis and structural equation modeling technique. The findings revealed that job recognition and job motivation have direct as well as indirect positive and significant impact on organizational commitment. The limitations, practical implications and future research indications are also explained. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/search?q=Job%20motivation" title="Job motivation">Job motivation</a>, <a href="https://publications.waset.org/search?q=job%20recognition" title=" job recognition"> job recognition</a>, <a href="https://publications.waset.org/search?q=employee%20engagement" title=" employee engagement"> employee engagement</a>, <a href="https://publications.waset.org/search?q=employee%20commitment" title=" employee commitment"> employee commitment</a>, <a href="https://publications.waset.org/search?q=public%20sector" title=" public sector"> public sector</a>, <a href="https://publications.waset.org/search?q=land%20record%20officers." title=" land record officers."> land record officers.</a> </p> <a href="https://publications.waset.org/10012007/the-influence-of-job-recognition-and-job-motivation-on-organizational-commitment-in-public-sector-the-mediation-role-of-employee-engagement" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/10012007/apa" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">APA</a> <a href="https://publications.waset.org/10012007/bibtex" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">BibTeX</a> <a href="https://publications.waset.org/10012007/chicago" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Chicago</a> <a href="https://publications.waset.org/10012007/endnote" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">EndNote</a> <a href="https://publications.waset.org/10012007/harvard" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">Harvard</a> <a href="https://publications.waset.org/10012007/json" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">JSON</a> <a href="https://publications.waset.org/10012007/mla" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">MLA</a> <a href="https://publications.waset.org/10012007/ris" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">RIS</a> <a href="https://publications.waset.org/10012007/xml" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">XML</a> <a href="https://publications.waset.org/10012007/iso690" target="_blank" rel="nofollow" class="btn btn-primary btn-sm">ISO 690</a> <a href="https://publications.waset.org/10012007.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">848</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">‹</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=64">64</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=65">65</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/search?q=Public%20emotion%20recognition&page=2" rel="next">›</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">© 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">×</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>