<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: emotion detection</title> <meta name="description" content="Search results for: emotion detection"> <meta name="keywords" content="emotion detection"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" 
alt="Open Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="emotion detection" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div 
class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="emotion detection"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 3835</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: emotion detection</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3835</span> Speech Detection Model Based on Deep Neural Networks Classifier for Speech Emotions Recognition</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=A.%20Shoiynbek">A. Shoiynbek</a>, <a href="https://publications.waset.org/abstracts/search?q=K.%20Kozhakhmet"> K. Kozhakhmet</a>, <a href="https://publications.waset.org/abstracts/search?q=P.%20Menezes"> P. Menezes</a>, <a href="https://publications.waset.org/abstracts/search?q=D.%20Kuanyshbay"> D. Kuanyshbay</a>, <a href="https://publications.waset.org/abstracts/search?q=D.%20Bayazitov"> D. Bayazitov</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Speech emotion recognition has received increasing research interest all through current years. 
In most research work, the emotional speech used was collected under controlled conditions: actors produced those recordings by imitating and artificially acting out emotions in front of a microphone. There are four issues with that approach, namely: (1) the emotions are not natural, which means that machines are learning to recognize fake emotions; (2) the emotions are very limited in quantity and poor in variety of speaking; (3) SER is language-dependent; (4) consequently, each time researchers want to start work on SER, they need to find a good emotional database in their language. In this paper, we propose an approach to creating an automatic tool for speech emotion extraction based on facial emotion recognition and describe the sequence of actions in the proposed approach. One of the first objectives in that sequence is the speech detection problem. The paper gives a detailed description of a speech detection model based on a fully connected deep neural network for the Kazakh and Russian languages. Despite the high speech detection results for Kazakh and Russian, the described process is suitable for any language. To illustrate the working capacity of the developed model, we have performed an analysis of speech detection and extraction on real tasks. 
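The model described in this abstract — per-frame MFCC feature vectors classified as speech or non-speech by a fully connected network — can be sketched in a few lines of NumPy. This is a minimal illustration with hypothetical layer sizes and randomly initialized weights, not the authors' trained Kazakh/Russian model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 13 MFCC coefficients per frame, one hidden layer.
N_MFCC, HIDDEN = 13, 64

# Randomly initialized weights stand in for the trained parameters.
W1 = rng.normal(0, 0.1, (N_MFCC, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0, 0.1, (HIDDEN, 1))
b2 = np.zeros(1)

def speech_probability(mfcc_frames: np.ndarray) -> np.ndarray:
    """Forward pass of a fully connected speech/non-speech classifier.

    mfcc_frames: (n_frames, N_MFCC) matrix of per-frame MFCC features.
    Returns a speech probability in (0, 1) for each frame.
    """
    h = np.maximum(0.0, mfcc_frames @ W1 + b1)      # ReLU hidden layer
    logits = h @ W2 + b2
    return (1.0 / (1.0 + np.exp(-logits))).ravel()  # sigmoid output
```

In practice, the weights would be learned from labeled speech/non-speech frames, and the MFCCs computed from windowed audio.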
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=deep%20neural%20networks" title="deep neural networks">deep neural networks</a>, <a href="https://publications.waset.org/abstracts/search?q=speech%20detection" title=" speech detection"> speech detection</a>, <a href="https://publications.waset.org/abstracts/search?q=speech%20emotion%20recognition" title=" speech emotion recognition"> speech emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=Mel-frequency%20cepstrum%20coefficients" title=" Mel-frequency cepstrum coefficients"> Mel-frequency cepstrum coefficients</a>, <a href="https://publications.waset.org/abstracts/search?q=collecting%20speech%20emotion%20corpus" title=" collecting speech emotion corpus"> collecting speech emotion corpus</a>, <a href="https://publications.waset.org/abstracts/search?q=collecting%20speech%20emotion%20dataset" title=" collecting speech emotion dataset"> collecting speech emotion dataset</a>, <a href="https://publications.waset.org/abstracts/search?q=Kazakh%20speech%20dataset" title=" Kazakh speech dataset"> Kazakh speech dataset</a> </p> <a href="https://publications.waset.org/abstracts/152814/speech-detection-model-based-on-deep-neural-networks-classifier-for-speech-emotions-recognition" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/152814.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">101</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3834</span> Multimodal Characterization of Emotion within Multimedia Space</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Dayo%20Samuel%20Banjo">Dayo Samuel 
Banjo</a>, <a href="https://publications.waset.org/abstracts/search?q=Connice%20Trimmingham"> Connice Trimmingham</a>, <a href="https://publications.waset.org/abstracts/search?q=Niloofar%20Yousefi"> Niloofar Yousefi</a>, <a href="https://publications.waset.org/abstracts/search?q=Nitin%20Agarwal"> Nitin Agarwal</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Technological advancement and its omnipresent connection have pushed humans past the boundaries and limitations of a computer screen, physical state, or geographical location. It has provided a depth of avenues that facilitate human-computer interaction that was once inconceivable, such as audio and body language detection. Given the complex modalities of emotions, it becomes vital to study human-computer interaction, as it is the commencement of a thorough understanding of the emotional state of users and, in the context of social networks, the producers of multimodal information. This study first confirms the higher classification accuracy of multimodal emotion detection systems compared to unimodal solutions. Second, it explores the characterization of multimedia content produced based on its emotions and the coherence of emotion in different modalities by utilizing deep learning models to classify emotion across different modalities. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=affective%20computing" title="affective computing">affective computing</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20recognition" title=" emotion recognition"> emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=multimodal" title=" multimodal"> multimodal</a> </p> <a href="https://publications.waset.org/abstracts/157830/multimodal-characterization-of-emotion-within-multimedia-space" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/157830.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">158</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3833</span> Speech Detection Model Based on Deep Neural Networks Classifier for Speech Emotions Recognition</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Aisultan%20Shoiynbek">Aisultan Shoiynbek</a>, <a href="https://publications.waset.org/abstracts/search?q=Darkhan%20Kuanyshbay"> Darkhan Kuanyshbay</a>, <a href="https://publications.waset.org/abstracts/search?q=Paulo%20Menezes"> Paulo Menezes</a>, <a href="https://publications.waset.org/abstracts/search?q=Akbayan%20Bekarystankyzy"> Akbayan Bekarystankyzy</a>, <a href="https://publications.waset.org/abstracts/search?q=Assylbek%20Mukhametzhanov"> Assylbek Mukhametzhanov</a>, <a href="https://publications.waset.org/abstracts/search?q=Temirlan%20Shoiynbek"> Temirlan Shoiynbek</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Speech emotion 
recognition (SER) has received increasing research interest in recent years. It is a common practice to utilize emotional speech collected under controlled conditions recorded by actors imitating and artificially producing emotions in front of a microphone. There are four issues related to that approach: emotions are not natural, meaning that machines are learning to recognize fake emotions; emotions are very limited in quantity and poor in variety of speaking; there is some language dependency in SER; consequently, each time researchers want to start work with SER, they need to find a good emotional database in their language. This paper proposes an approach to create an automatic tool for speech emotion extraction based on facial emotion recognition and describes the sequence of actions involved in the proposed approach. One of the first objectives in the sequence of actions is the speech detection issue. The paper provides a detailed description of the speech detection model based on a fully connected deep neural network for Kazakh and Russian. Despite the high results in speech detection for Kazakh and Russian, the described process is suitable for any language. To investigate the working capacity of the developed model, an analysis of speech detection and extraction from real tasks has been performed. 
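The extraction step that follows detection — turning frame-wise speech/non-speech decisions into time segments — can be sketched as follows. The frame hop and gap tolerance here are illustrative assumptions, not values from the paper:

```python
def speech_segments(frame_is_speech, frame_sec=0.02, min_gap=2):
    """Group frame-wise speech decisions into (start, end) time segments.

    frame_is_speech: sequence of booleans, one per analysis frame.
    frame_sec: assumed frame hop in seconds (hypothetical value).
    min_gap: non-speech frames tolerated inside a segment (simple hangover).
    """
    segments, start, gap = [], None, 0
    for i, is_speech in enumerate(frame_is_speech):
        if is_speech:
            if start is None:
                start = i
            gap = 0
        elif start is not None:
            gap += 1
            if gap > min_gap:  # too long a pause: close the segment
                segments.append((start * frame_sec, (i - gap + 1) * frame_sec))
                start, gap = None, 0
    if start is not None:  # segment still open at end of stream
        segments.append((start * frame_sec, (len(frame_is_speech) - gap) * frame_sec))
    return segments
```

The returned segments could then be cut from the audio and paired with facial emotion labels, as in the proposed corpus-collection tool.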
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=deep%20neural%20networks" title="deep neural networks">deep neural networks</a>, <a href="https://publications.waset.org/abstracts/search?q=speech%20detection" title=" speech detection"> speech detection</a>, <a href="https://publications.waset.org/abstracts/search?q=speech%20emotion%20recognition" title=" speech emotion recognition"> speech emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=Mel-frequency%20cepstrum%20coefficients" title=" Mel-frequency cepstrum coefficients"> Mel-frequency cepstrum coefficients</a>, <a href="https://publications.waset.org/abstracts/search?q=collecting%20speech%20emotion%20corpus" title=" collecting speech emotion corpus"> collecting speech emotion corpus</a>, <a href="https://publications.waset.org/abstracts/search?q=collecting%20speech%20emotion%20dataset" title=" collecting speech emotion dataset"> collecting speech emotion dataset</a>, <a href="https://publications.waset.org/abstracts/search?q=Kazakh%20speech%20dataset" title=" Kazakh speech dataset"> Kazakh speech dataset</a> </p> <a href="https://publications.waset.org/abstracts/189328/speech-detection-model-based-on-deep-neural-networks-classifier-for-speech-emotions-recognition" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/189328.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">26</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3832</span> Emotion Oriented Students&#039; Opinioned Topic Detection for Course Reviews in Massive Open Online Course</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a 
href="https://publications.waset.org/abstracts/search?q=Zhi%20Liu">Zhi Liu</a>, <a href="https://publications.waset.org/abstracts/search?q=Xian%20Peng"> Xian Peng</a>, <a href="https://publications.waset.org/abstracts/search?q=Monika%20Domanska"> Monika Domanska</a>, <a href="https://publications.waset.org/abstracts/search?q=Lingyun%20Kang"> Lingyun Kang</a>, <a href="https://publications.waset.org/abstracts/search?q=Sannyuya%20Liu"> Sannyuya Liu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Massive Open education has become increasingly popular among worldwide learners. An increasing number of course reviews are being generated in Massive Open Online Course (MOOC) platform, which offers an interactive feedback channel for learners to express opinions and feelings in learning. These reviews typically contain subjective emotion and topic information towards the courses. However, it is time-consuming to artificially detect these opinions. In this paper, we propose an emotion-oriented topic detection model to automatically detect the students’ opinioned aspects in course reviews. The known overall emotion orientation and emotional words in each review are used to guide the joint probabilistic modeling of emotion and aspects in reviews. Through the experiment on real-life review data, it is verified that the distribution of course-emotion-aspect can be calculated to capture the most significant opinioned topics in each course unit. This proposed technique helps in conducting intelligent learning analytics for teachers to improve pedagogies and for developers to promote user experiences. 
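The full joint probabilistic model is beyond a short snippet, but the counting intuition behind a course-emotion-aspect distribution — which opinioned aspects co-occur with which emotion orientation — can be sketched as follows. The seed lexicon and aspect list are hypothetical placeholders for the learned distributions:

```python
from collections import Counter, defaultdict

# Hypothetical seed lexicons; the paper learns these distributions instead.
EMOTION_WORDS = {"love": "positive", "great": "positive",
                 "boring": "negative", "confusing": "negative"}
ASPECT_WORDS = {"lectures", "quizzes", "slides", "pace"}

def emotion_aspect_counts(reviews):
    """Count aspect mentions per emotion orientation across course reviews.

    Each review contributes its aspect words to the orientation of the
    emotion words it contains (a crude stand-in for the joint model).
    """
    table = defaultdict(Counter)
    for review in reviews:
        tokens = review.lower().split()
        orientations = {EMOTION_WORDS[t] for t in tokens if t in EMOTION_WORDS}
        aspects = [t for t in tokens if t in ASPECT_WORDS]
        for orientation in orientations:
            table[orientation].update(aspects)
    return table
```

Normalizing each counter would give a rough per-orientation aspect distribution for a course unit.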
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Massive%20Open%20Online%20Course%20%28MOOC%29" title="Massive Open Online Course (MOOC)">Massive Open Online Course (MOOC)</a>, <a href="https://publications.waset.org/abstracts/search?q=course%20reviews" title=" course reviews"> course reviews</a>, <a href="https://publications.waset.org/abstracts/search?q=topic%20model" title=" topic model"> topic model</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20recognition" title=" emotion recognition"> emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=topical%20aspects" title=" topical aspects"> topical aspects</a> </p> <a href="https://publications.waset.org/abstracts/86771/emotion-oriented-students-opinioned-topic-detection-for-course-reviews-in-massive-open-online-course" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/86771.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">262</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3831</span> The Role of Emotion in Attention Allocation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Michaela%20Porubanova">Michaela Porubanova</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this exploratory study to examine the effects of emotional significance on change detection using the flicker paradigm, three different categories of scenes were randomly presented (neutral, positive and negative) in three different blocks. 
We hypothesized that because of the different effects on attention, performance in change detection tasks differs for scenes with different affective values. We found that change detection accuracy was greatest for changes occurring in positive and negative scenes (compared with neutral scenes). Second, and most importantly, changes in negative scenes (and also positive scenes, though not with statistical significance) were detected faster than changes in neutral scenes. Interestingly, women were less accurate than men in detecting changes in emotionally significant scenes (both negative and positive); that is, women detected fewer changes in emotional scenes within the 40 s time limit. On the other hand, women were quicker than men to detect changes in positive and negative images. The study contributes to understanding the role of emotion in information processing; the role of emotion in attention will be discussed. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=attention" title="attention">attention</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion" title=" emotion"> emotion</a>, <a href="https://publications.waset.org/abstracts/search?q=flicker%20task" title=" flicker task"> flicker task</a>, <a href="https://publications.waset.org/abstracts/search?q=IAPS" title=" IAPS"> IAPS</a> </p> <a href="https://publications.waset.org/abstracts/10319/the-role-of-emotion-in-attention-allocation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/10319.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">354</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3830</span> Emotion Recognition in Video and Images in the
Wild</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Faizan%20Tariq">Faizan Tariq</a>, <a href="https://publications.waset.org/abstracts/search?q=Moayid%20Ali%20Zaidi"> Moayid Ali Zaidi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Facial emotion recognition algorithms are advancing rapidly, and researchers combine different algorithms to obtain the best results. Six basic emotions are commonly studied in this area. The authors attempted to recognize facial expressions using object detection algorithms instead of traditional ones. Two object detection algorithms were chosen: Faster R-CNN and YOLO. For pre-processing, image rotation and batch normalization were used. The dataset chosen for the experiments is Static Facial Expressions in the Wild (SFEW). The approach worked well, but there is still considerable room for improvement, which will be a direction for future work. 
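The pre-processing steps mentioned above (image rotation and batch normalization) can be illustrated generically in NumPy; this is a sketch of the standard operations, not the exact configuration used with Faster R-CNN or YOLO:

```python
import numpy as np

def batch_normalize(images: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    """Normalize a batch of images channel-wise to zero mean, unit variance.

    images: (batch, height, width, channels) array.
    """
    mean = images.mean(axis=(0, 1, 2), keepdims=True)
    var = images.var(axis=(0, 1, 2), keepdims=True)
    return (images - mean) / np.sqrt(var + eps)

def rotate90(images: np.ndarray, k: int = 1) -> np.ndarray:
    """Rotate every image in the batch by k * 90 degrees (rotation augmentation)."""
    return np.rot90(images, k=k, axes=(1, 2))
```

A trained batch-normalization layer would additionally carry learned scale and shift parameters; this sketch shows only the normalization itself.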
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=face%20recognition" title="face recognition">face recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20recognition" title=" emotion recognition"> emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=CNN" title=" CNN"> CNN</a> </p> <a href="https://publications.waset.org/abstracts/152635/emotion-recognition-in-video-and-images-in-the-wild" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/152635.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">187</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3829</span> An Investigation the Effectiveness of Emotion Regulation Training on the Reduction of Cognitive-Emotion Regulation Problem in Patients with Multiple Sclerosis</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mahboobeh%20Sadeghi">Mahboobeh Sadeghi</a>, <a href="https://publications.waset.org/abstracts/search?q=Zahra%20Izadi%20Khah"> Zahra Izadi Khah</a>, <a href="https://publications.waset.org/abstracts/search?q=Mansour%20Hakim%20Javadi"> Mansour Hakim Javadi</a>, <a href="https://publications.waset.org/abstracts/search?q=Masoud%20Gholamali%20Lavasani"> Masoud Gholamali Lavasani</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Background: Since there is a relation between psychological and physiological factors, the aim of this study was to examine the effect of Emotion Regulation training on cognitive emotion regulation 
problem in patients with Multiple Sclerosis (MS). Method: In a randomized clinical trial, thirty patients diagnosed with Multiple Sclerosis referred to the state welfare organization were selected. The sample was randomized into either an experimental group or a non-intervention control group. The subjects participated in 75-minute treatment sessions held three times a week for 4 weeks (12 sessions). All 30 individuals were administered the Cognitive Emotion Regulation Questionnaire (CERQ). Participants completed the questionnaire at pretest and post-test. Data obtained from the questionnaire were analyzed using MANCOVA. Results: Emotion Regulation training significantly decreased cognitive emotion regulation problems in patients with Multiple Sclerosis (p < 0.001). Conclusions: Emotion Regulation training can be used for the treatment of cognitive emotion regulation problems in Multiple Sclerosis. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Multiple%20Sclerosis" title="Multiple Sclerosis">Multiple Sclerosis</a>, <a href="https://publications.waset.org/abstracts/search?q=cognitive-emotion%20regulation" title=" cognitive-emotion regulation"> cognitive-emotion regulation</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20regulation" title=" emotion regulation"> emotion regulation</a>, <a href="https://publications.waset.org/abstracts/search?q=MS" title=" MS"> MS</a> </p> <a href="https://publications.waset.org/abstracts/8075/an-investigation-the-effectiveness-of-emotion-regulation-training-on-the-reduction-of-cognitive-emotion-regulation-problem-in-patients-with-multiple-sclerosis" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/8075.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">459</span> </span> </div> </div> <div class="card
paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3828</span> Hand Gestures Based Emotion Identification Using Flex Sensors</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=S.%20Ali">S. Ali</a>, <a href="https://publications.waset.org/abstracts/search?q=R.%20Yunus"> R. Yunus</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Arif"> A. Arif</a>, <a href="https://publications.waset.org/abstracts/search?q=Y.%20Ayaz"> Y. Ayaz</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Baber%20Sial"> M. Baber Sial</a>, <a href="https://publications.waset.org/abstracts/search?q=R.%20Asif"> R. Asif</a>, <a href="https://publications.waset.org/abstracts/search?q=N.%20Naseer"> N. Naseer</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Jawad%20Khan"> M. Jawad Khan </a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this study, we propose a gesture-to-emotion recognition method using flex sensors mounted on the metacarpophalangeal joints. The flex sensors are fixed in a wearable glove, and the data from the glove are sent to a PC over Wi-Fi. Four gestures (finger pointing, thumbs up, fist open, and fist close) were performed by five subjects. Each gesture was categorized into a sad, happy, or excited class based on the velocity and acceleration of the hand movement. Seventeen inspectors observed the emotions and hand gestures of the five subjects, and the emotional states based on the investigators' assessments were compared with the acquired movement speed data. Overall, we achieved 77% accurate results. Therefore, the proposed design can be used for emotional state detection applications. 
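A simplified version of the speed-based emotion mapping described above might look like the following; the thresholds are hypothetical stand-ins for the calibration derived from the inspectors' assessments:

```python
def classify_emotion(positions, dt=0.05, excited_v=2.0, happy_v=1.0):
    """Map hand-gesture movement speed to an emotion class.

    positions: sequence of 1-D sensor readings sampled every dt seconds.
    Mean absolute velocity above excited_v -> 'excited', above happy_v
    -> 'happy', otherwise 'sad'. Thresholds and sample rate are
    illustrative, not the values calibrated in the study.
    """
    velocities = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    mean_speed = sum(abs(v) for v in velocities) / len(velocities)
    if mean_speed > excited_v:
        return "excited"
    if mean_speed > happy_v:
        return "happy"
    return "sad"
```

A fuller implementation would also use acceleration, as the abstract notes, and one classifier per gesture type.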
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotion%20identification" title="emotion identification">emotion identification</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20models" title=" emotion models"> emotion models</a>, <a href="https://publications.waset.org/abstracts/search?q=gesture%20recognition" title=" gesture recognition"> gesture recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=user%20perception" title=" user perception"> user perception</a> </p> <a href="https://publications.waset.org/abstracts/98297/hand-gestures-based-emotion-identification-using-flex-sensors" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/98297.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">285</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3827</span> Emotions in Health Tweets: Analysis of American Government Official Accounts</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Garc%C3%ADa%20L%C3%B3pez">García López</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The Government Departments of Health have the task of informing and educating citizens about public health issues. For this, they use channels like Twitter, key in the search for health information and the propagation of content. The tweets, important in the virality of the content, may contain emotions that influence the contagion and exchange of knowledge. 
The goal of this study is to analyze the emotional projection of health information shared on Twitter by official American accounts: the disease control account <em>CDCgov</em>; the National Institutes of Health, <em>NIH</em>; the government agency <em>HHSGov</em>; and the professional organization <em>PublicHealth</em>. For this, we used Tone Analyzer, an International Business Machines Corporation (IBM) tool specialized in detecting emotion in text, following the categorical model of emotion representation. For 15 days, all tweets from these accounts were analyzed with this text-based emotion analysis tool. The results showed that their tweets carry a substantial emotional load, a determining factor in the success of their communications. This shows that official accounts also use subjective language and convey emotions. The predominance of joy over sadness and the strong presence of emotion in their tweets stimulate the virality of content, which is key to the informational mission of government health departments. 
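Tone Analyzer itself is a proprietary IBM service, but the categorical counting it performs can be sketched with a crude lexicon-based stand-in; the lexicon below is an invented toy example, not IBM's:

```python
from collections import Counter

# Tiny illustrative lexicon for the categorical model (joy/sadness/fear/anger);
# Tone Analyzer's actual lexicon and classifier are proprietary.
LEXICON = {
    "healthy": "joy", "protect": "joy", "celebrate": "joy",
    "loss": "sadness", "grief": "sadness",
    "outbreak": "fear", "risk": "fear",
    "failure": "anger",
}

def emotion_profile(tweets):
    """Count categorical-emotion word hits across a list of tweets."""
    counts = Counter()
    for tweet in tweets:
        for token in tweet.lower().split():
            emotion = LEXICON.get(token.strip(".,!?#"))
            if emotion:
                counts[emotion] += 1
    return counts
```

Comparing, say, `counts["joy"]` against `counts["sadness"]` over a 15-day window gives the kind of emotion-predominance figure the study reports.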
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotions%20in%20tweets" title="emotions in tweets">emotions in tweets</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20detection%20in%20the%20text" title=" emotion detection in the text"> emotion detection in the text</a>, <a href="https://publications.waset.org/abstracts/search?q=health%20information%20on%20Twitter" title=" health information on Twitter"> health information on Twitter</a>, <a href="https://publications.waset.org/abstracts/search?q=American%20health%20official%20accounts" title=" American health official accounts"> American health official accounts</a>, <a href="https://publications.waset.org/abstracts/search?q=emotions%20on%20Twitter" title=" emotions on Twitter"> emotions on Twitter</a>, <a href="https://publications.waset.org/abstracts/search?q=emotions%20and%20content" title=" emotions and content"> emotions and content</a> </p> <a href="https://publications.waset.org/abstracts/95743/emotions-in-health-tweets-analysis-of-american-government-official-accounts" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/95743.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">142</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3826</span> Parental Bonding and Cognitive Emotion Regulation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Fariea%20Bakul">Fariea Bakul</a>, <a href="https://publications.waset.org/abstracts/search?q=Chhanda%20Karmaker"> Chhanda Karmaker</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The present study was designed to investigate the effects of 
parental bonding on adults’ cognitive emotion regulation and to investigate gender differences in parental bonding and cognitive emotion regulation. Data were collected using a convenience sampling technique from 100 adult students (50 males and 50 females) of different universities in Dhaka city, aged 20 to 25 years, using the Bengali versions of the Parental Bonding Inventory and the Cognitive Emotion Regulation Questionnaire. The obtained data were analyzed using multiple regression analysis and independent samples t-tests. The results revealed that only fathers’ care (β = 0.317, p < 0.05) was significantly positively associated with adults’ cognitive emotion regulation. The adjusted R² indicated that the model explained 30% of the variance in adults’ adaptive cognitive emotion regulation. No significant association was found between parental bonding and less adaptive cognitive emotion regulation strategies. Results from the independent samples t-tests also revealed no significant gender difference in either parental bonding or cognitive emotion regulation. 
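A multiple regression of the kind reported above can be sketched with ordinary least squares in NumPy. The data below are synthetic and the variable names hypothetical; the sketch only illustrates how β coefficients and an R² fit statistic are obtained:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # sample size, as in the study

# Synthetic predictors (hypothetical parental bonding scores).
father_care = rng.normal(25, 5, n)
mother_care = rng.normal(27, 5, n)
# Synthetic outcome: adaptive cognitive emotion regulation,
# driven here mainly by father's care by construction.
outcome = 0.4 * father_care + 0.1 * mother_care + rng.normal(0, 3, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), father_care, mother_care])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)

# R^2: proportion of outcome variance explained by the model.
pred = X @ beta
r2 = 1 - np.sum((outcome - pred) ** 2) / np.sum((outcome - outcome.mean()) ** 2)
```

With these synthetic data, `beta[1]` recovers a coefficient near the constructed 0.4 effect of father's care, mirroring the pattern the abstract reports.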
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=cognitive%20emotion%20regulation" title="cognitive emotion regulation">cognitive emotion regulation</a>, <a href="https://publications.waset.org/abstracts/search?q=parental%20bonding" title=" parental bonding"> parental bonding</a>, <a href="https://publications.waset.org/abstracts/search?q=parental%20care" title=" parental care"> parental care</a>, <a href="https://publications.waset.org/abstracts/search?q=parental%20over-protection" title=" parental over-protection"> parental over-protection</a> </p> <a href="https://publications.waset.org/abstracts/66673/parental-bonding-and-cognitive-emotion-regulation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/66673.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">371</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3825</span> Emotion Detection in a General Human-Robot Interaction System Optimized for Embedded Platforms</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Julio%20Vega">Julio Vega</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Expression recognition is a field of Artificial Intelligence whose main objectives are to recognize basic forms of affective expression that appear on people’s faces and to contribute to behavioral studies. In this work, a ROS node has been developed that, based on Deep Learning techniques, is capable of detecting the facial expressions of the people that appear in the image. These algorithms were optimized so that they can be executed in real time on an embedded platform. 
The experiments were carried out on a PC with a USB camera and on a Raspberry Pi 4 with a PiCamera. The final results show a viable system, capable of working in real time even on an embedded platform. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=python" title="python">python</a>, <a href="https://publications.waset.org/abstracts/search?q=low-cost" title=" low-cost"> low-cost</a>, <a href="https://publications.waset.org/abstracts/search?q=raspberry%20pi" title=" raspberry pi"> raspberry pi</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20detection" title=" emotion detection"> emotion detection</a>, <a href="https://publications.waset.org/abstracts/search?q=human-robot%20interaction" title=" human-robot interaction"> human-robot interaction</a>, <a href="https://publications.waset.org/abstracts/search?q=ROS%20node" title=" ROS node"> ROS node</a> </p> <a href="https://publications.waset.org/abstracts/151311/emotion-detection-in-a-general-human-robot-interaction-system-optimized-for-embedded-platforms" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/151311.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">129</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3824</span> Optimizing Machine Learning Through Python Based Image Processing Techniques</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Srinidhi.%20A">Srinidhi. 
A</a>, <a href="https://publications.waset.org/abstracts/search?q=Naveed%20Ahmed"> Naveed Ahmed</a>, <a href="https://publications.waset.org/abstracts/search?q=Twinkle%20Hareendran"> Twinkle Hareendran</a>, <a href="https://publications.waset.org/abstracts/search?q=Vriksha%20Prakash"> Vriksha Prakash</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This work reviews some of the advanced image processing techniques for deep learning applications. Object detection by template matching, image denoising, edge detection, and super-resolution modelling are but a few of the tasks. The paper examines these in detail, since such tasks are crucial preprocessing steps that increase the quality and usability of image datasets in subsequent deep learning tasks. We also review methods for assessing image quality, specifically sharpness, which is crucial for robust model performance. Further, we discuss the development of deep learning models for facial emotion detection, age classification, and gender classification, including the preprocessing techniques interrelated with model performance. Conclusions from this study pinpoint best practices in the preparation of image datasets, targeting the best trade-off between computational efficiency and the retention of image features critical for effective training of deep learning models. 
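One common sharpness measure of the kind alluded to above is the variance of the Laplacian response: blurry images have weak second derivatives, so a low score flags images worth excluding from a training set. A minimal NumPy sketch (the abstract does not specify the paper's exact method, so this is illustrative only):

```python
import numpy as np

def laplacian_variance(img: np.ndarray) -> float:
    """Sharpness score for a 2-D grayscale image: variance of the
    Laplacian response over all 'valid' 3x3 windows."""
    k = np.array([[0, 1, 0],
                  [1, -4, 1],
                  [0, 1, 0]], dtype=float)  # discrete Laplacian kernel
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):          # naive 'valid' 2-D convolution
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i+3, j:j+3] * k)
    return float(out.var())
```

A flat image scores exactly zero, while a high-contrast pattern such as a checkerboard scores well above it, so a simple threshold can filter blurry samples during dataset preparation.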
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=image%20processing" title="image processing">image processing</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning%20applications" title=" machine learning applications"> machine learning applications</a>, <a href="https://publications.waset.org/abstracts/search?q=template%20matching" title=" template matching"> template matching</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20detection" title=" emotion detection"> emotion detection</a> </p> <a href="https://publications.waset.org/abstracts/193107/optimizing-machine-learning-through-python-based-image-processing-techniques" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/193107.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">15</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3823</span> Age Related Changes in the Neural Substrates of Emotion Regulation: Mechanisms, Consequences, and Interventions</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yasaman%20Mohammadi">Yasaman Mohammadi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Emotion regulation is a complex process that allows individuals to manage and modulate their emotional responses in order to adaptively respond to environmental demands. As individuals age, emotion regulation abilities may decline, leading to an increased vulnerability to mood disorders and other negative health outcomes. 
Advances in neuroimaging techniques have greatly enhanced our understanding of the neural substrates underlying emotion regulation and age-related changes in these neural systems. Additionally, genetic research has identified several candidate genes that may influence age-related changes in emotion regulation. In this paper, we review recent findings from neuroimaging and genetic research on age-related changes in the neural substrates of emotion regulation, highlighting the mechanisms and consequences of these changes. We also discuss potential interventions, including cognitive and behavioral approaches, that may be effective in mitigating age-related declines in emotion regulation. We propose that a better understanding of the mechanisms underlying age-related changes in emotion regulation may lead to the development of more targeted interventions aimed at promoting healthy emotional functioning in older adults. Overall, this paper highlights the importance of studying age-related changes in emotion regulation and provides a roadmap for future research in this field. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotion%20regulation" title="emotion regulation">emotion regulation</a>, <a href="https://publications.waset.org/abstracts/search?q=aging" title=" aging"> aging</a>, <a href="https://publications.waset.org/abstracts/search?q=neural%20substrates" title=" neural substrates"> neural substrates</a>, <a href="https://publications.waset.org/abstracts/search?q=neuroimaging" title=" neuroimaging"> neuroimaging</a>, <a href="https://publications.waset.org/abstracts/search?q=emotional%20functioning" title=" emotional functioning"> emotional functioning</a>, <a href="https://publications.waset.org/abstracts/search?q=healthy%20aging" title=" healthy aging"> healthy aging</a> </p> <a href="https://publications.waset.org/abstracts/166512/age-related-changes-in-the-neural-substrates-of-emotion-regulation-mechanisms-consequences-and-interventions" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/166512.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">112</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3822</span> Analysis of Nonlinear and Non-Stationary Signal to Extract the Features Using Hilbert Huang Transform</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=A.%20N.%20Paithane">A. N. Paithane</a>, <a href="https://publications.waset.org/abstracts/search?q=D.%20S.%20Bormane"> D. S. Bormane</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20D.%20Shirbahadurkar"> S. D. 
Shirbahadurkar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Emotion recognition is an important research topic in the field of human-computer interaction. A novel technique for feature extraction (FE) is presented here, together with a new method for human emotion recognition based on the Hilbert-Huang Transform (HHT). This method is well suited to analyzing nonlinear and non-stationary signals. Each signal is decomposed into intrinsic mode functions (IMFs) using empirical mode decomposition (EMD). These functions are used to extract features through a fission and fusion process. The decomposition technique we adopt is a new technique for adaptively decomposing signals. In this perspective, we report the potential usefulness of EMD-based techniques. We evaluated the algorithm on the Augsburg University database, a manually annotated database. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=intrinsic%20mode%20function%20%28IMF%29" title="intrinsic mode function (IMF)">intrinsic mode function (IMF)</a>, <a href="https://publications.waset.org/abstracts/search?q=Hilbert-Huang%20transform%20%28HHT%29" title=" Hilbert-Huang transform (HHT)"> Hilbert-Huang transform (HHT)</a>, <a href="https://publications.waset.org/abstracts/search?q=empirical%20mode%20decomposition%20%28EMD%29" title=" empirical mode decomposition (EMD)"> empirical mode decomposition (EMD)</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20detection" title=" emotion detection"> emotion detection</a>, <a href="https://publications.waset.org/abstracts/search?q=electrocardiogram%20%28ECG%29" title=" electrocardiogram (ECG)"> electrocardiogram (ECG)</a> </p> <a href="https://publications.waset.org/abstracts/19551/analysis-of-nonlinear-and-non-stationary-signal-to-extract-the-features-using-hilbert-huang-transform" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/19551.pdf" 
target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">580</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3821</span> The Role of Parental Stress and Emotion Regulation in Responding to Children’s Expression of Negative Emotion</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Lizel%20Bertie">Lizel Bertie</a>, <a href="https://publications.waset.org/abstracts/search?q=Kim%20Johnston"> Kim Johnston</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Parental emotion regulation plays a central role in the socialisation of emotion, especially when teaching young children to cope with negative emotions. Despite evidence that non-supportive parental responses to children’s expression of negative emotions have implications for the social and emotional development of the child, few studies have investigated risk factors that impact parental emotion socialisation processes. The current study aimed to explore the extent to which parental stress contributes to both difficulties in parental emotion regulation and non-supportive parental responses to children’s expression of negative emotions. In addition, the study examined whether parental use of expressive suppression as an emotion regulation strategy facilitates the influence of parental stress on non-supportive responses by testing the relations in a mediation model. A sample of 140 Australian adults, who identified as parents with children aged 5 to 10 years, completed an online questionnaire. 
The measures explored recent symptoms of depression, anxiety, and stress, the use of expressive suppression as an emotion regulation strategy, and hypothetical parental responses to scenarios related to children’s expression of negative emotions. A mediated regression indicated that parents who reported higher levels of stress also reported higher levels of expressive suppression as an emotion regulation strategy and increased use of non-supportive responses to young children’s expression of negative emotions. These findings suggest that parents who experience heightened symptoms of stress are more likely both to suppress their emotions in parent-child interaction and to engage in non-supportive responses. Furthermore, higher use of expressive suppression strongly predicted the use of non-supportive responses, even in the presence of parental stress. Contrary to expectation, no indirect effect of stress on non-supportive responses was observed via expressive suppression. The findings suggest that parental stress may be a particularly salient manifestation of psychological distress in a sub-clinical population of parents while contributing to impaired parental responses. As such, the study offers support for targeting overarching factors such as difficulties in parental emotion regulation and stress management, not only as an intervention for parental psychological distress, but also for the detection and prevention of maladaptive parenting practices. 
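A mediation model of the kind tested above can be sketched as two regressions whose coefficient product gives the indirect effect. All data below are synthetic and for illustration only; the study's actual estimation procedure may differ:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 140  # sample size, as in the study

# Synthetic standardized scores (hypothetical): stress predicts
# suppression, and suppression predicts non-supportive responses.
stress = rng.normal(0, 1, n)
suppression = 0.5 * stress + rng.normal(0, 1, n)
nonsupport = 0.6 * suppression + 0.2 * stress + rng.normal(0, 1, n)

def ols(predictors, y):
    """OLS coefficients (intercept first) via least squares."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols([stress], suppression)[1]              # path a: stress -> suppression
b = ols([suppression, stress], nonsupport)[1]  # path b, controlling for stress
indirect = a * b                               # indirect (mediated) effect
```

In practice the indirect effect's significance is then assessed, for example with a bootstrap confidence interval; here the product is positive by construction.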
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotion%20regulation" title="emotion regulation">emotion regulation</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20socialisation" title=" emotion socialisation"> emotion socialisation</a>, <a href="https://publications.waset.org/abstracts/search?q=expressive%20suppression" title=" expressive suppression"> expressive suppression</a>, <a href="https://publications.waset.org/abstracts/search?q=non-supportive%20responses" title=" non-supportive responses"> non-supportive responses</a>, <a href="https://publications.waset.org/abstracts/search?q=parental%20stress" title=" parental stress"> parental stress</a> </p> <a href="https://publications.waset.org/abstracts/110523/the-role-of-parental-stress-and-emotion-regulation-in-responding-to-childrens-expression-of-negative-emotion" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/110523.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">160</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3820</span> Deep-Learning Based Approach to Facial Emotion Recognition through Convolutional Neural Network</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Nouha%20Khediri">Nouha Khediri</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammed%20Ben%20Ammar"> Mohammed Ben Ammar</a>, <a href="https://publications.waset.org/abstracts/search?q=Monji%20Kherallah"> Monji Kherallah</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Recently, facial emotion recognition (FER) has become increasingly essential to understand the state of the human 
mind. Accurately classifying emotion from the face is a challenging task. In this paper, we present a facial emotion recognition approach named CV-FER, benefiting from deep learning, especially CNN and VGG16. First, the data is pre-processed with data cleaning and data rotation. Then, we augment the data and proceed to our FER model, which contains five convolutional layers and five pooling layers. Finally, a softmax classifier is used in the output layer to recognize emotions. The paper also reviews related work on facial emotion recognition based on deep learning. Experiments show that our model outperforms other methods using the same FER2013 database and yields a recognition rate of 92%. We also put forward some suggestions for future work. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=CNN" title="CNN">CNN</a>, <a href="https://publications.waset.org/abstracts/search?q=deep-learning" title=" deep-learning"> deep-learning</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20emotion%20recognition" title=" facial emotion recognition"> facial emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a> </p> <a href="https://publications.waset.org/abstracts/150291/deep-learning-based-approach-to-facial-emotion-recognition-through-convolutional-neural-network" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/150291.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">95</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3819</span> Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant 
Features</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Vesna%20Kirandziska">Vesna Kirandziska</a>, <a href="https://publications.waset.org/abstracts/search?q=Nevena%20Ackovska"> Nevena Ackovska</a>, <a href="https://publications.waset.org/abstracts/search?q=Ana%20Madevska%20Bogdanova"> Ana Madevska Bogdanova</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Emotion recognition is a challenging problem, still open from the perspective of both intelligent systems and psychology. In this paper, both voice features and facial features are used to build an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are given, and a discussion of the validity and the expressiveness of different emotions is presented. A comparison is made between classifiers built from facial data only, voice data only, and the combination of both. The need for a better combination of the information from facial expression and voice data is argued. 
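Feature-level fusion of voice and facial data, as compared above, can be sketched as simple vector concatenation before classification. The sketch below substitutes a nearest-centroid classifier for the paper's Support Vector Machines to stay dependency-free, and the feature values are invented for illustration:

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Compute one centroid per class; a lightweight stand-in
    for the SVM classifiers used in the paper."""
    classes = sorted(set(y))
    centroids = np.array([X[np.array(y) == c].mean(axis=0) for c in classes])
    return classes, centroids

def nearest_centroid_predict(classes, centroids, x):
    """Assign x to the class with the closest centroid."""
    d = np.linalg.norm(centroids - x, axis=1)
    return classes[int(np.argmin(d))]

# Fusing modalities: concatenate voice and face feature vectors per sample.
voice = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
face = np.array([[0.7], [0.6], [0.1], [0.2]])
X = np.hstack([voice, face])          # fused 3-D feature vectors
y = ["joy", "joy", "sad", "sad"]

classes, centroids = nearest_centroid_fit(X, y)
```

Classifiers trained on voice only, face only, or the fused matrix `X` can then be compared on the same held-out samples, which is the comparison the abstract describes.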
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotion%20recognition" title="emotion recognition">emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20recognition" title=" facial recognition"> facial recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=signal%20processing" title=" signal processing"> signal processing</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a> </p> <a href="https://publications.waset.org/abstracts/42384/comparing-emotion-recognition-from-voice-and-facial-data-using-time-invariant-features" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/42384.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">316</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3818</span> Job Characteristics, Emotion Regulation and University Teachers&#039; Well-Being: A Job Demands-Resources Analysis</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jiying%20Han">Jiying Han</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Teaching is widely known to be an emotional endeavor, and teachers’ ability to regulate their emotions is important for their well-being and the effectiveness of their classroom management. 
Considering that teachers’ emotion regulation is an underexplored issue in the field of educational research, some studies have attempted to explore the role of emotion regulation in teachers’ work and to explore the links between teachers’ emotion regulation, job characteristics, and well-being, based on the Job Demands-Resources (JD-R) model. However, those studies targeted primary or secondary teachers. So far, very little is known about the relationships between university teachers’ emotion regulation and its antecedents and effects on teacher well-being. Based on the job demands-resources model and emotion regulation theory, this study examined the relationships between job characteristics of university teaching (i.e., emotional job demands and teaching support), emotion regulation strategies (i.e., reappraisal and suppression), and university teachers’ well-being. Data collected from a questionnaire survey of 643 university teachers in China were analysed. The results indicated that (1) both emotional job demands and teaching support had desirable effects on university teachers’ well-being; (2) both emotional job demands and teaching support facilitated university teachers’ use of reappraisal strategies; and (3) reappraisal was beneficial to university teachers’ well-being, whereas suppression was harmful. These findings support the applicability of the job demands-resources model to the contexts of higher education and highlight the mediating role of emotion regulation. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotional%20job%20demands" title="emotional job demands">emotional job demands</a>, <a href="https://publications.waset.org/abstracts/search?q=teaching%20support" title=" teaching support"> teaching support</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20regulation%20strategies" title=" emotion regulation strategies"> emotion regulation strategies</a>, <a href="https://publications.waset.org/abstracts/search?q=the%20job%20demands-resources%20model" title=" the job demands-resources model"> the job demands-resources model</a> </p> <a href="https://publications.waset.org/abstracts/119662/job-characteristics-emotion-regulation-and-university-teachers-well-being-a-job-demands-resources-analysis" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/119662.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">157</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3817</span> Documents Emotions Classification Model Based on TF-IDF Weighting Measure</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Amr%20Mansour%20Mohsen">Amr Mansour Mohsen</a>, <a href="https://publications.waset.org/abstracts/search?q=Hesham%20Ahmed%20Hassan"> Hesham Ahmed Hassan</a>, <a href="https://publications.waset.org/abstracts/search?q=Amira%20M.%20Idrees"> Amira M. Idrees</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Emotion classification of text documents is applied to reveal whether a document expresses a particular emotion of its writer. 
While different supervised methods have previously been used for emotion document classification, in this research we present a novel model that supports classification algorithms for more accurate results through the TF-IDF measure. Different experiments have been applied to demonstrate the applicability of the proposed model; the model succeeds in raising accuracy on the chosen metrics (precision, recall, and f-measure) by refining the lexicon, integrating lexicons from different perspectives, and applying the TF-IDF weighting measure to the classifying features. The proposed model has also been compared with other research to demonstrate its competence in raising the results&rsquo; accuracy. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotion%20detection" title="emotion detection">emotion detection</a>, <a href="https://publications.waset.org/abstracts/search?q=TF-IDF" title=" TF-IDF"> TF-IDF</a>, <a href="https://publications.waset.org/abstracts/search?q=WEKA%20tool" title=" WEKA tool"> WEKA tool</a>, <a href="https://publications.waset.org/abstracts/search?q=classification%20algorithms" title=" classification algorithms"> classification algorithms</a> </p> <a href="https://publications.waset.org/abstracts/41563/documents-emotions-classification-model-based-on-tf-idf-weighting-measure" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/41563.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">484</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3816</span> The Relationships among Learning Emotion, Major Satisfaction, Learning Flow, and Academic Achievement in Medical School Students</h5> 
<div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=S.%20J.%20Yune">S. J. Yune</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20Y.%20Lee"> S. Y. Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20J.%20Im"> S. J. Im</a>, <a href="https://publications.waset.org/abstracts/search?q=B.%20S.%20Kam"> B. S. Kam</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20Y.%20Baek"> S. Y. Baek</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This study explored whether academic emotion, major satisfaction, and learning flow are associated with academic achievement in medical school. Emotion and affective factors are known to be important in students' learning and performance. Emotion has taken the stage in much of contemporary educational psychology literature, no longer relegated to secondary status behind traditionally studied cognitive constructs. Medical school students (n=164) completed an online survey measuring academic emotion, major satisfaction, and learning flow. Academic performance was operationalized as students' average grade on two semester exams. For data analysis, correlation analysis, multiple regression analysis, hierarchical multiple regression analyses, and ANOVA were conducted. The results largely confirmed the hypothesized relations among academic emotion, major satisfaction, learning flow, and academic achievement. Positive academic emotion was associated with academic achievement (β=.191). Positive emotion had 8.5% explanatory power for academic achievement. In particular, sense of accomplishment had a significant impact on learning performance (β=.265). On the other hand, negative emotion, major satisfaction, and learning flow did not affect academic performance. 
Also, by grade there were differences in the positive emotions of sense of great (F=5.446, p=.001) and interest (F=2.78, p=.043), and in the negative emotions of boredom (F=3.55, p=.016), anger (F=4.346, p=.006), and petulance (F=3.779, p=.012). This study suggested that medical students' positive emotion was an important contributor to their academic achievement. At the same time, it is important to consider that some negative emotions can act to increase one’s motivation. Of particular importance is the notion that instructors can and should create learning environments that foster positive emotion for students. In doing so, instructors improve their chances of positively impacting students’ achievement emotions, as well as their subsequent motivation, learning, and performance. This result has implications for medical educators striving to understand the personal emotional factors that influence learning and performance in medical training. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=academic%20achievement" title="academic achievement">academic achievement</a>, <a href="https://publications.waset.org/abstracts/search?q=learning%20emotion" title=" learning emotion"> learning emotion</a>, <a href="https://publications.waset.org/abstracts/search?q=learning%20flow" title=" learning flow"> learning flow</a>, <a href="https://publications.waset.org/abstracts/search?q=major%20satisfaction" title=" major satisfaction"> major satisfaction</a> </p> <a href="https://publications.waset.org/abstracts/58446/the-relationships-among-learning-emotion-major-satisfaction-learning-flow-and-academic-achievement-in-medical-school-students" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/58446.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">272</span> </span> </div> 
</div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3815</span> Facial Emotion Recognition with Convolutional Neural Network Based Architecture</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Koray%20U.%20Erbas">Koray U. Erbas</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Neural networks are appealing for many applications since they are able to learn complex non-linear relationships between input and output data. As the number of neurons and layers in a neural network increases, it is possible to represent more complex relationships with automatically extracted features. Nowadays, Deep Neural Networks (DNNs) are widely used in computer vision problems such as classification, object detection, segmentation, and image editing. In this work, the facial emotion recognition task is performed by a proposed Convolutional Neural Network (CNN)-based DNN architecture using the FER2013 dataset. Moreover, the effects of different hyperparameters (activation function, kernel size, initializer, batch size, and network size) are investigated, and ablation study results for the pooling layer, dropout, and batch normalization are presented. 
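As an illustration of the building blocks this abstract names (convolution, kernel size, pooling), here is a minimal NumPy sketch of one CNN layer's forward pass on a FER2013-sized 48x48 grayscale image. It is not the paper's architecture; the edge kernel and gradient image are invented for demonstration.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2D convolution (no padding, stride 1): the core CNN operation."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear activation, one of the activation-function choices."""
    return np.maximum(x, 0)

def maxpool2(x):
    """2x2 max pooling, the layer type examined in the ablation study."""
    h, w = x.shape[0] // 2 * 2, x.shape[1] // 2 * 2
    return x[:h, :w].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

img = np.arange(48 * 48, dtype=float).reshape(48, 48)  # FER2013 images are 48x48 grayscale
edge = np.array([[1., 0., -1.]] * 3)                   # simple vertical-edge kernel
feat = maxpool2(relu(conv2d(img, edge)))
print(feat.shape)  # (23, 23)
```

A real network stacks many such kernels (learned, not hand-picked) and layers, which is where the kernel-size and network-size hyperparameters studied in the paper come in.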
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20network" title="convolutional neural network">convolutional neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning%20based%20FER" title=" deep learning based FER"> deep learning based FER</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20emotion%20recognition" title=" facial emotion recognition"> facial emotion recognition</a> </p> <a href="https://publications.waset.org/abstracts/128197/facial-emotion-recognition-with-convolutional-neural-network-based-architecture" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/128197.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">274</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3814</span> A Systematic Review Emotion Regulation through Music in Children, Adults, and Elderly</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Fabiana%20Ribeiro">Fabiana Ribeiro</a>, <a href="https://publications.waset.org/abstracts/search?q=Ana%20Moreno"> Ana Moreno</a>, <a href="https://publications.waset.org/abstracts/search?q=Antonio%20Oliveira"> Antonio Oliveira</a>, <a href="https://publications.waset.org/abstracts/search?q=Patricia%20Oliveira-Silva"> Patricia Oliveira-Silva</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Music is present in our daily lives, and to our knowledge music is often used to change the emotions in the listeners. 
For this reason, the objective of this study was to explore and synthesize results examining the use and effects of music on emotion regulation in children, adults, and the elderly, and to clarify whether music is effective across ages in promoting emotion regulation. A literature search was conducted using ISI Web of Knowledge, Pubmed, PsycINFO, and Scopus; inclusion criteria comprised children, adolescents, and young and older adults from healthy populations. Articles applying a musical intervention, specifically musical listening, and assessing emotion regulation directly through reports or neurophysiological measures were included in this review. Results showed age differences in the function of musical listening: adolescents showed increments in emotional listening compared to children, as did young adults in comparison to older adults; young adults use music for emotion regulation and social connection, while older adults also use music for emotion regulation in search of personal growth. Moreover, some of the studies showed that personal characteristics would also determine the effectiveness of the emotion regulation strategy. In conclusion, it was observed that music can benefit all ages investigated; however, this review detected a need to develop adequate paradigms to explore the use of music for emotion regulation. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=music" title="music">music</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion" title=" emotion"> emotion</a>, <a href="https://publications.waset.org/abstracts/search?q=regulation" title=" regulation"> regulation</a>, <a href="https://publications.waset.org/abstracts/search?q=musical%20listening" title=" musical listening"> musical listening</a> </p> <a href="https://publications.waset.org/abstracts/98460/a-systematic-review-emotion-regulation-through-music-in-children-adults-and-elderly" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/98460.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">171</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3813</span> Automatic Detection and Filtering of Negative Emotion-Bearing Contents from Social Media in Amharic Using Sentiment Analysis and Deep Learning Methods</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Derejaw%20Lake%20Melie">Derejaw Lake Melie</a>, <a href="https://publications.waset.org/abstracts/search?q=Alemu%20Kumlachew%20Tegegne"> Alemu Kumlachew Tegegne</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The increasing prevalence of social media in Ethiopia has exacerbated societal challenges by fostering the proliferation of negative emotional posts and comments. Illicit use of social media has further exacerbated divisions among the population. 
Addressing these issues through manual identification and aggregation of emotions from millions of users for swift decision-making poses significant challenges, particularly given the rapid growth of Amharic language usage on social platforms. Consequently, there is a critical need to develop an intelligent system capable of automatically detecting and categorizing negative emotional content into social, religious, and political categories while also filtering out toxic online content. This paper aims to leverage sentiment analysis techniques to achieve automatic detection and filtering of negative emotional content from Amharic social media texts, employing a comparative study of deep learning algorithms. The study utilized a dataset comprising 29,962 comments collected from social media platforms using comment exporter software. Data pre-processing techniques were applied to enhance data quality, followed by the implementation of deep learning methods for training, testing, and evaluation. The results showed that CNN, GRU, LSTM, and Bi-LSTM classification models achieved accuracies of 83%, 50%, 84%, and 86%, respectively. Among these models, Bi-LSTM demonstrated the highest accuracy of 86% in the experiment. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=negative%20emotion" title="negative emotion">negative emotion</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20detection" title=" emotion detection"> emotion detection</a>, <a href="https://publications.waset.org/abstracts/search?q=social%20media%20filtering%20sentiment%20analysis" title=" social media filtering sentiment analysis"> social media filtering sentiment analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning." 
title=" deep learning."> deep learning.</a> </p> <a href="https://publications.waset.org/abstracts/191945/automatic-detection-and-filtering-of-negative-emotion-bearing-contents-from-social-media-in-amharic-using-sentiment-analysis-and-deep-learning-methods" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/191945.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">23</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3812</span> Emotion Regulation Mediates the Relationship between Affective Disposition and Depression</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Valentina%20Colonnello">Valentina Colonnello</a>, <a href="https://publications.waset.org/abstracts/search?q=Paolo%20Maria%20Russo"> Paolo Maria Russo</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Studies indicate a link between individual differences in affective disposition and depression, as well as between emotion dysregulation and depression. However, the specific role of emotion dysregulation domains in mediating the relationship between affective disposition and depression remains largely unexplored. In three cross-sectional quantitative studies (total n = 1350), we explored the extent to which specific emotion regulation difficulties mediate the relationship between personal distress disposition (Study 1), separation distress as a primary emotional trait (Study 2), and an insecure, anxious attachment style (Study 3) and depression. Across all studies, we found that the relationship between affective disposition and depression was mediated by difficulties in accessing adaptive emotion regulation strategies. 
These findings underscore the potential for modifiable abilities that could be targeted through preventive interventions. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotions" title="emotions">emotions</a>, <a href="https://publications.waset.org/abstracts/search?q=mental%20health" title=" mental health"> mental health</a>, <a href="https://publications.waset.org/abstracts/search?q=individual%20traits" title=" individual traits"> individual traits</a>, <a href="https://publications.waset.org/abstracts/search?q=personality" title=" personality"> personality</a> </p> <a href="https://publications.waset.org/abstracts/183449/emotion-regulation-mediates-the-relationship-between-affective-disposition-and-depression" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/183449.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">66</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3811</span> A Comparison of South East Asian Face Emotion Classification based on Optimized Ellipse Data Using Clustering Technique </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=M.%20Karthigayan">M. Karthigayan</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Rizon"> M. Rizon</a>, <a href="https://publications.waset.org/abstracts/search?q=Sazali%20Yaacob"> Sazali Yaacob</a>, <a href="https://publications.waset.org/abstracts/search?q=R.%20Nagarajan"> R. Nagarajan</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Muthukumaran"> M. 
Muthukumaran</a>, <a href="https://publications.waset.org/abstracts/search?q=Thinaharan%20Ramachandran"> Thinaharan Ramachandran</a>, <a href="https://publications.waset.org/abstracts/search?q=Sargunam%20Thirugnanam"> Sargunam Thirugnanam </a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, a set of irregular and regular ellipse fitting equations, optimized using a genetic algorithm (GA), is applied to the lip and eye features to classify human emotions. Two South East Asian (SEA) faces are considered in this work for the emotion classification. Six emotions and one neutral state are considered as the output. Each subject shows unique characteristics of the lip and eye features for the various emotions. GA is adopted to optimize the irregular ellipse characteristics of the lip and eye features in each emotion. That is, the top portion of the lip configuration is part of one ellipse and the bottom part of a different ellipse. Two ellipse-based fitness equations are proposed for the lip configuration, and the relevant parameters that define the emotions are listed. The GA method has achieved reasonably successful classification of emotion. In some emotion classifications, however, the optimized data values of one emotion overlap with the ranges of other emotions. In order to overcome this overlapping problem between the emotion optimized values, and at the same time to improve the classification, a fuzzy clustering method (FCM) has been implemented to offer better classification. The GA-FCM approach offers reasonably good classification within the ranges of the clusters; this has been demonstrated by applying it to the two SEA subjects, improving the classification rate. 
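The fuzzy clustering step described above can be sketched as a minimal fuzzy c-means implementation; the 1-D values standing in for optimized ellipse parameters are invented for illustration, not the paper's data.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means (FCM): the clustering technique used in the
    GA-FCM approach to separate overlapping emotion ranges.
    X: (n, d) data, c: number of clusters, m: fuzziness exponent."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]   # weighted cluster centers
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))                 # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two well-separated 1-D "ellipse parameter" groups (illustrative only).
X = np.array([[0.1], [0.2], [0.15], [5.0], [5.1], [4.9]])
centers, U = fuzzy_cmeans(X, c=2)
print(np.sort(centers.ravel()))  # ≈ [0.15, 5.0]
```

Unlike hard clustering, each sample keeps a graded membership in every cluster, which is what lets FCM assign overlapping emotion values to the most plausible range.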
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ellipse%20fitness%20function" title="ellipse fitness function">ellipse fitness function</a>, <a href="https://publications.waset.org/abstracts/search?q=genetic%20algorithm" title=" genetic algorithm"> genetic algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20recognition" title=" emotion recognition"> emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=fuzzy%20clustering" title=" fuzzy clustering "> fuzzy clustering </a> </p> <a href="https://publications.waset.org/abstracts/16362/a-comparison-of-south-east-asian-face-emotion-classification-based-on-optimized-ellipse-data-using-clustering-technique" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/16362.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">546</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3810</span> The Effectiveness of Dialectical Behavior Therapy in Developing Emotion Regulation Skill for Adolescent with Intellectual Disability</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shahnaz%20Safitri">Shahnaz Safitri</a>, <a href="https://publications.waset.org/abstracts/search?q=Rose%20Mini%20Agoes%20Salim"> Rose Mini Agoes Salim</a>, <a href="https://publications.waset.org/abstracts/search?q=Pratiwi%20Widyasari"> Pratiwi Widyasari</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Intellectual disability is characterized by significant limitations in intellectual functioning and adaptive behavior that appears before the age of 18 years old. 
The prominent impacts of intellectual disability in adolescents are failure to establish interpersonal relationships as socially expected and lower academic achievement. Meanwhile, it is known that emotion regulation skills have a role in supporting the functioning of the individual, both by nourishing the development of social skills and by facilitating the process of learning and adaptation in school. This study aims to examine the effectiveness of Dialectical Behavior Therapy (DBT) in developing emotion regulation skills for adolescents with intellectual disability. DBT's special consideration of clients’ social environment and their biological condition is foreseen to be the key to developing emotion regulation capacity in subjects with intellectual disability. Through observations of the client's behavior, conducted before and after completion of the DBT intervention program, it was found that there is an improvement in the client's knowledge and attitudes related to the mastery of emotion regulation skills. In addition, the client's consistency in actually practicing emotion regulation techniques over time is largely influenced by the support received from the client's social circles. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=adolescent" title="adolescent">adolescent</a>, <a href="https://publications.waset.org/abstracts/search?q=dialectical%20behavior%20therapy" title=" dialectical behavior therapy"> dialectical behavior therapy</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20regulation" title=" emotion regulation"> emotion regulation</a>, <a href="https://publications.waset.org/abstracts/search?q=intellectual%20disability" title=" intellectual disability"> intellectual disability</a> </p> <a href="https://publications.waset.org/abstracts/72895/the-effectiveness-of-dialectical-behavior-therapy-in-developing-emotion-regulation-skill-for-adolescent-with-intellectual-disability" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/72895.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">304</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3809</span> Various Perspectives for the Concept of the Emotion Labor</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jae%20Soo%20Do">Jae Soo Do</a>, <a href="https://publications.waset.org/abstracts/search?q=Kyoung-Seok%20Kim"> Kyoung-Seok Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Radical changes in the industrial environment and spectacular developments in IT have shifted the current of management from people-centered to technology- or IT-centered. Interpersonal emotion exchanges have long since become insipid, and interactive services have come to resemble mechanical reactions. 
This study offers various perspectives on the concept of emotion labor, based on traditional studies of emotional labor. The present day, in which human emotions are liable to be served as machinized things, makes the study of human emotions all the more momentous. Previous research on emotional labor has commonly and basically dealt with the relationship between the active group who performs actions and the passive group who receives them. This study focuses on the passive group and tries to offer a new perspective of 'liquid emotion' as a defence mechanism for the passive group against the external environment. In particular, it presents a concrete discussion of directions for subsequent studies on liquid emotion as a newly suggested perspective. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotion%20labor" title="emotion labor">emotion labor</a>, <a href="https://publications.waset.org/abstracts/search?q=surface%20acting" title=" surface acting"> surface acting</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20acting" title=" deep acting"> deep acting</a>, <a href="https://publications.waset.org/abstracts/search?q=liquid%20emotion" title=" liquid emotion"> liquid emotion</a> </p> <a href="https://publications.waset.org/abstracts/68091/various-perspectives-for-the-concept-of-the-emotion-labor" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/68091.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">346</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3808</span> Emotion Expression of the Leader and Collective Efficacy: Pride and Guilt</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a 
href="https://publications.waset.org/abstracts/search?q=Hsiu-Tsu%20Cho">Hsiu-Tsu Cho</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Collective efficacy refers to a group’s sense of its capacity to complete a task successfully or to reach objectives. Little effort has been expended on investigating the relationship between the emotion expression of a leader and collective efficacy. In this study, we examined the impact of the different emotions and emotion expression of a group leader on collective efficacy and explored whether the emotion-expressive effects differed under conditions of negative and positive emotions. A total of 240 undergraduate and graduate students recruited using Facebook and posters at a university participated in this research. The participants were separated randomly into 80 groups of four persons, each consisting of three participants and a confederate. They were randomly assigned to one of five conditions in a 2 (pride vs. guilt) × 2 (emotion expression of group leader vs. no emotion expression of group leader) factorial design and a control condition. Each four-person group was instructed to compete for the reward in a group competition by solving the five-disk Tower of Hanoi puzzle and making decisions on an investment case. We surveyed the participants using an emotion measure revised from previous research and a collective efficacy questionnaire on a 5-point scale. To induce an emotion of pride (or guilt), the experimenter announced whether the group performance was good enough to have a chance of getting the reward (ranking in the top or bottom 20% among all groups) after the group task. The leader (confederate) either expressed or did not express a feeling of pride (or guilt) following the instructions, according to the assigned condition. To check the manipulation of emotion, we added a control condition under which the experimenter revealed no results regarding group performance, maintaining a neutral emotion. 
One-way ANOVAs and post hoc pairwise comparisons among the three emotion conditions (pride, guilt, and control) were conducted on the pride and guilt scores (pride: F(1,75) = 32.41, p < .001; guilt: F(1,75) = 6.75, p < .05). The results indicated that the manipulations of emotion were successful. A two-way between-measures ANOVA was conducted to examine the predictions of the main effects of emotion type and emotion expression, as well as the interaction effect of these two variables, on collective efficacy. The experimental findings suggest that pride did not affect collective efficacy (F(1,60) = 1.90, ns.) more than guilt did and that the group leader did not motivate collective efficacy regardless of whether he or she expressed emotion (F(1,60) = .89, ns.). However, the interaction effect of emotion type and emotion expression was statistically significant (F(1,60) = 4.27, p < .05, ω2 = .066); the effect accounted for 6.6% of the variance. Additional results revealed that, under the pride condition, the leader enhanced collective efficacy when expressing emotion, whereas, under the guilt condition, an expression of emotion reduced collective efficacy. Overall, these findings challenge the assumption that the effects of emotion expression are the same for all emotions and suggest that a leader should be cautious when expressing negative emotions toward a group to avoid reducing group effectiveness. 
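For readers unfamiliar with the reported statistics, an F value for a one-way ANOVA such as F(1,75) = 32.41 can be computed as below; the rating data are hypothetical, not the study's.

```python
import numpy as np

def one_way_anova_F(groups):
    """One-way ANOVA F statistic: ratio of between-group to within-group
    mean squares, with degrees of freedom (k - 1, n - k)."""
    all_x = np.concatenate(groups)
    grand = all_x.mean()
    k, n = len(groups), len(all_x)
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), (df_b, df_w)

# Hypothetical 5-point pride ratings under two conditions.
pride = np.array([4.5, 4.2, 4.8, 4.6])
control = np.array([2.1, 2.4, 2.0, 2.3])
F, df = one_way_anova_F([pride, control])
print(F, df)
```

A large F relative to its degrees of freedom indicates that the condition means differ more than within-group noise would predict, which is how the manipulation check above is judged successful.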
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=collective%20efficacy" title="collective efficacy">collective efficacy</a>, <a href="https://publications.waset.org/abstracts/search?q=group%20leader" title=" group leader"> group leader</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20expression" title=" emotion expression"> emotion expression</a>, <a href="https://publications.waset.org/abstracts/search?q=pride" title=" pride"> pride</a>, <a href="https://publications.waset.org/abstracts/search?q=guilty" title=" guilty"> guilty</a> </p> <a href="https://publications.waset.org/abstracts/44290/emotion-expression-of-the-leader-and-collective-efficacy-pride-and-guilt" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/44290.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">330</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3807</span> The Effect of Heart Rate and Valence of Emotions on Perceived Intensity of Emotion</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Madeleine%20Nicole%20G.%20Bernardo">Madeleine Nicole G. Bernardo</a>, <a href="https://publications.waset.org/abstracts/search?q=Katrina%20T.%20Feliciano"> Katrina T. Feliciano</a>, <a href="https://publications.waset.org/abstracts/search?q=Marcelo%20Nonato%20A.%20Nacionales%20III"> Marcelo Nonato A. Nacionales III</a>, <a href="https://publications.waset.org/abstracts/search?q=Diane%20Frances%20M.%20Peralta"> Diane Frances M. Peralta</a>, <a href="https://publications.waset.org/abstracts/search?q=Denise%20Nicole%20V.%20Profeta"> Denise Nicole V. 
Profeta</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This study aims to find out whether heart rate variability and valence of emotion have an effect on perceived intensity of emotion. Psychology undergraduates (N = 60) from the University of the Philippines Diliman were shown 10 photographs from the Japanese Female Facial Expression (JAFFE) Database, along with a corresponding questionnaire with a Likert scale on perceived intensity of emotion. In this 3 x 2 mixed subjects factorial design, each group was either made to do a simple exercise prior to answering the questionnaire in order to increase the heart rate, listen to a heart rate of 120 bpm, or colour a drawing to keep the heart rate stable. After doing the activity, the participants answered the questionnaire, rating the faces according to their perceived emotional intensity of the photographs. The photographs presented were either of positive or negative emotional valence. The results of the experiment showed that neither an induced fast heart rate nor a perceived fast heart rate had any significant effect on the participants’ perceived intensity of emotion. There was also no interaction effect of heart rate variability and valence of emotion. The null results were explained by the Philippines’ high-context culture, accompanied by the prevalence of both intensely valenced positive and negative emotions in Philippine society. The null effects were also attributed to the Cannon-Bard theory, the Schachter-Singer theory, and various methodological limitations. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=heart%20rate%20variability" title="heart rate variability">heart rate variability</a>, <a href="https://publications.waset.org/abstracts/search?q=perceived%20intensity%20of%20emotion" title=" perceived intensity of emotion"> perceived intensity of emotion</a>, <a href="https://publications.waset.org/abstracts/search?q=Philippines" title=" Philippines "> Philippines </a>, <a href="https://publications.waset.org/abstracts/search?q=valence%20of%20emotion" title=" valence of emotion"> valence of emotion</a> </p> <a href="https://publications.waset.org/abstracts/92075/the-effect-of-heart-rate-and-valence-of-emotions-on-perceived-intensity-of-emotion" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/92075.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">253</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3806</span> Investigating the Acquisition of English Emotion Terms by Moroccan EFL Learners</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Khalid%20El%20Asri">Khalid El Asri</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Culture influences lexicalization of salient concepts in a society. Hence, languages often have different degrees of equivalence regarding lexical items of different fields. The present study focuses on the field of emotions in English and Moroccan Arabic. Findings of a comparative study that involved fifty English emotions revealed that Moroccan Arabic has equivalence of some English emotion terms, partial equivalence of some emotion terms, and no equivalence for some other terms. 
It is hypothesized, then, that emotion terms that have near equivalents in Moroccan Arabic will be easier for EFL learners to acquire, while partially equivalent terms will be difficult to acquire, and those that have no equivalents will be even more difficult to acquire. In order to test these hypotheses, the participants (104 advanced Moroccan EFL learners and 104 native speakers of English) were given two tests: the first is a receptive one in which the participants were asked to choose, among four emotion terms, the term that is appropriate to fill in the blank for a given situation indicating a certain kind of feeling. The second test is a productive one in which the participants were asked to give the emotion term that best described the feelings of the people in the situations given. The results showed that conceptually equivalent terms do not pose any problems for Moroccan EFL learners, since they can link the concept to an already existing linguistic category; the results concerning the acquisition of partially equivalent terms, however, indicated that this type of emotion term was difficult for Moroccan EFL learners to acquire, because they need to restructure the boundaries of the target linguistic categories by expanding them when the term includes a range of meanings not subsumed in the L1 term. Surprisingly, however, the results concerning the case of non-equivalence revealed that Moroccan EFL learners could internalize the target L2 concepts that have no equivalents in their L1. Thus, it is the category of emotion terms that have partial equivalents in the learners’ L1 that poses problems for them. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=acquisition" title="acquisition">acquisition</a>, <a href="https://publications.waset.org/abstracts/search?q=culture" title="culture">culture</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20terms" title="emotion terms">emotion terms</a>, <a href="https://publications.waset.org/abstracts/search?q=lexical%20equivalence" title="lexical equivalence">lexical equivalence</a> </p> <a href="https://publications.waset.org/abstracts/77812/investigating-the-acquisition-of-english-emotion-terms-by-moroccan-efl-learners" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/77812.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">227</span> </span> </div> </div> </div> </main> </body> </html>