Search results for: disgust
<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: disgust</title> <meta name="description" content="Search results for: disgust"> <meta name="keywords" content="disgust"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" alt="Open Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="disgust" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" 
href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="disgust"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 24</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: disgust</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">24</span> Linking Disgust and Misophonia: The Role of Mental Contamination</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Laurisa%20Peters">Laurisa Peters</a>, <a href="https://publications.waset.org/abstracts/search?q=Usha%20Barahmand"> Usha Barahmand</a>, <a href="https://publications.waset.org/abstracts/search?q=Maria%20Stalias-Mantzikos"> Maria Stalias-Mantzikos</a>, <a href="https://publications.waset.org/abstracts/search?q=Naila%20Shamsina"> Naila Shamsina</a>, <a href="https://publications.waset.org/abstracts/search?q=Kerry%20Aguero"> Kerry Aguero</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the current study, the authors sought to examine whether the links between moral and sexual disgust and misophonia are mediated by mental contamination. An internationally diverse sample of 283 adults (193 females, 76 males, and 14 non-binary individuals) ranging in age from 18 to 60 years old was recruited from online social media platforms and survey recruitment sites. The sample completed an online battery of scales that consisted of the New York Misophonia Scale, State Mental Contamination Scale, and the Three-Domain Disgust Scale. The hypotheses were evaluated using a series of mediations performed using the PROCESS add-on in SPSS. Correlations were found between emotional and aggressive-avoidant reactions in misophonia, mental contamination, pathogen disgust, and sexual disgust. Moral disgust and non-aggressive reactions in misophonia failed to correlate significantly with any of the other constructs. Sexual disgust had direct and indirect effects, while pathogen disgust had only direct effects on aspects of misophonia. 
23. Unconscious Bias in Judicial Decisions: Legal Genealogy and Disgust in Cases of Private, Adult, Consensual Sexual Acts Leading to Injury
Authors: Susanna Menis
Abstract: 'Unconscious' bias is widespread, affecting decision-making at all levels of society and beyond. Placed in the context of law, this study explores the direct effect of the psycho-social and cultural evolution of unconscious bias on how a judicial decision was made. The aim is to contribute to socio-legal scholarship by examining the formation of unconscious bias and its influence on the creation of legal rules that judges believe reflect social solidarity and protect against violence. The study seeks to understand how concepts like criminalization and unlawfulness are constructed by the common law. The methodology follows two theoretical approaches: historical genealogy and emotions as sociocultural phenomena. Both methods involve 'tracing back' the original formation of a social way of seeing and doing things in common. The significance of this study lies in the importance of reflecting on the ways unconscious bias may be formed; placing judges' decisions under this spotlight forces us to challenge the status quo, interrogate justice, and seek refinement of the law.
Keywords: legal genealogy, emotions, disgust, criminal law
Procedia: https://publications.waset.org/abstracts/173490/unconscious-bias-in-judicial-decisions-legal-genealogy-and-disgust-in-cases-of-private-adult-consensual-sexual-acts-leading-to-injury | PDF: https://publications.waset.org/abstracts/173490.pdf | Downloads: 61
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=legal%20geneology" title="legal geneology">legal geneology</a>, <a href="https://publications.waset.org/abstracts/search?q=emotions" title=" emotions"> emotions</a>, <a href="https://publications.waset.org/abstracts/search?q=disgust" title=" disgust"> disgust</a>, <a href="https://publications.waset.org/abstracts/search?q=criminal%20law" title=" criminal law"> criminal law</a> </p> <a href="https://publications.waset.org/abstracts/173490/unconscious-bias-in-judicial-decisions-legal-genealogy-and-disgust-in-cases-of-private-adult-consensual-sexual-acts-leading-to-injury" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/173490.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">61</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">22</span> Emotion Processing Differences Between People</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Elif%20Unveren">Elif Unveren</a>, <a href="https://publications.waset.org/abstracts/search?q=Ozlem%20Bozkurt"> Ozlem Bozkurt</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Emotion processing happens when someone has a negative, stressful experience and gets over it in time, and it is a different experience for every person. As to look into emotion processing can be categorised by intensity, awareness, coordination, speed, accuracy and response. It may vary depending on people’s age, sex and conditions. Each emotion processing shows different activation patterns in different brain regions. Activation is significantly higher in the right frontal areas. The highest activation happens in extended frontotemporal areas during the processing of happiness, sadness and disgust. Those emotions also show widely disturbed differences and get produced earlier than anger and fear. For different occasions, listed variables may have less or more importance. A borderline personality disorder is a condition that creates an unstable personality, sudden mood swings and unpredictability of actions. According to a study that was made with healthy people and people who had BPD, there were significant differences in some categories of emotion processing, such as intensity, awareness and accuracy. According to another study that was made to show the emotional processing differences between puberty and was made for only females who were between the ages of 11 and 17, it was perceived that for different ages and hormone levels, different parts of the brain are used to understand the given task. Also, in the different study that was made for kids that were between the age of 4 and 15, it was observed that the older kids were processing emotion more intensely and expressing it to a greater extent. There was a significant increase in fear and disgust in those matters. To sum up, we can say that the activity of undertaking negative experiences is a unique thing for everybody for many different reasons. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=age" title="age">age</a>, <a href="https://publications.waset.org/abstracts/search?q=sex" title=" sex"> sex</a>, <a href="https://publications.waset.org/abstracts/search?q=conditions" title=" conditions"> conditions</a>, <a href="https://publications.waset.org/abstracts/search?q=brain%20regions" title=" brain regions"> brain regions</a>, <a href="https://publications.waset.org/abstracts/search?q=emotion%20processing" title=" emotion processing"> emotion processing</a> </p> <a href="https://publications.waset.org/abstracts/164526/emotion-processing-differences-between-people" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/164526.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">85</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">21</span> Development of the New York Misophonia Scale: Implications for Diagnostic Criteria</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Usha%20Barahmand">Usha Barahmand</a>, <a href="https://publications.waset.org/abstracts/search?q=Maria%20Stalias"> Maria Stalias</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdul%20Haq"> Abdul Haq</a>, <a href="https://publications.waset.org/abstracts/search?q=Esther%20Rotlevi"> Esther Rotlevi</a>, <a href="https://publications.waset.org/abstracts/search?q=Ying%20Xiang"> Ying Xiang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Misophonia is a condition in which specific repetitive oral, nasal, or other sounds and movements made by humans trigger impulsive aversive reactions of irritation or disgust that instantly become anger. A few measures exist for the assessment of misophonia, but each has some limitations, and evidence for a formal diagnosis is still lacking. The objective of this study was to develop a reliable and valid measure of misophonia for use in the general population. Adopting a purely descriptive approach, this study focused on developing a self-report measure using all triggers and reactions identified in previous studies on misophonia. A measure with two subscales, one assessing the aversive quality of various triggers and the other assessing reactions of individuals, was developed. Data were gathered from a large sample of both men and women ranging in age from 18 to 65 years. Exploratory factor analysis revealed three main triggers: oral/nasal sounds, hand and leg movements, and environmental sounds. Two clusters of reactions also emerged: nonangry attempts to avoid the impact of the aversive stimuli and angry attempts to stop the aversive stimuli. The examination of the psychometric properties of the scale revealed its internal consistency and test-retest reliability to be excellent. The scale was also found to have very good concurrent and convergent validity. Significant annoyance and disgust in response to the triggers were reported by 12% of the sample, although for some specific triggers, rates as high as 31% were also reported. These findings have implications for the delineation of the criteria for identifying misophonia as a clinical condition. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=adults" title="adults">adults</a>, <a href="https://publications.waset.org/abstracts/search?q=factor%20analysis" title=" factor analysis"> factor analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=misophonia" title=" misophonia"> misophonia</a>, <a href="https://publications.waset.org/abstracts/search?q=psychometric%20properties" title=" psychometric properties"> psychometric properties</a>, <a href="https://publications.waset.org/abstracts/search?q=scale" title=" scale"> scale</a> </p> <a href="https://publications.waset.org/abstracts/131254/development-of-the-new-york-misophonia-scale-implications-for-diagnostic-criteria" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/131254.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">207</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">20</span> Curvelet Features with Mouth and Face Edge Ratios for Facial Expression Identification </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=S.%20Kherchaoui">S. Kherchaoui</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Houacine"> A. Houacine</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents a facial expression recognition system. It performs identification and classification of the seven basic expressions; happy, surprise, fear, disgust, sadness, anger, and neutral states. It consists of three main parts. The first one is the detection of a face and the corresponding facial features to extract the most expressive portion of the face, followed by a normalization of the region of interest. Then calculus of curvelet coefficients is performed with dimensionality reduction through principal component analysis. The resulting coefficients are combined with two ratios; mouth ratio and face edge ratio to constitute the whole feature vector. The third step is the classification of the emotional state using the SVM method in the feature space. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20expression%20identification" title="facial expression identification">facial expression identification</a>, <a href="https://publications.waset.org/abstracts/search?q=curvelet%20coefficient" title=" curvelet coefficient"> curvelet coefficient</a>, <a href="https://publications.waset.org/abstracts/search?q=support%20vector%20machine%20%28SVM%29" title=" support vector machine (SVM)"> support vector machine (SVM)</a>, <a href="https://publications.waset.org/abstracts/search?q=recognition%20system" title=" recognition system"> recognition system</a> </p> <a href="https://publications.waset.org/abstracts/10311/curvelet-features-with-mouth-and-face-edge-ratios-for-facial-expression-identification" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/10311.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">232</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">19</span> Learning to Recommend with Negative Ratings Based on Factorization Machine</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Caihong%20Sun">Caihong Sun</a>, <a href="https://publications.waset.org/abstracts/search?q=Xizi%20Zhang"> Xizi Zhang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Rating prediction is an important problem for recommender systems. The task is to predict the rating for an item that a user would give. Most of the existing algorithms for the task ignore the effect of negative ratings rated by users on items, but the negative ratings have a significant impact on users’ purchasing decisions in practice. In this paper, we present a rating prediction algorithm based on factorization machines that consider the effect of negative ratings inspired by Loss Aversion theory. The aim of this paper is to develop a concave and a convex negative disgust function to evaluate the negative ratings respectively. Experiments are conducted on MovieLens dataset. The experimental results demonstrate the effectiveness of the proposed methods by comparing with other four the state-of-the-art approaches. The negative ratings showed much importance in the accuracy of ratings predictions. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=factorization%20machines" title="factorization machines">factorization machines</a>, <a href="https://publications.waset.org/abstracts/search?q=feature%20engineering" title=" feature engineering"> feature engineering</a>, <a href="https://publications.waset.org/abstracts/search?q=negative%20ratings" title=" negative ratings"> negative ratings</a>, <a href="https://publications.waset.org/abstracts/search?q=recommendation%20systems" title=" recommendation systems"> recommendation systems</a> </p> <a href="https://publications.waset.org/abstracts/71527/learning-to-recommend-with-negative-ratings-based-on-factorization-machine" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/71527.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">242</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">18</span> Colada Sweet Like Mercy: Gender Stereotyping in Twitter Conversations by Big Brother Naija 2019 Viewers</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mary-Magdalene%20N.%20Chumbow">Mary-Magdalene N. Chumbow</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This study explores how a reality TV show which aired in Nigeria in 2019 (Big Brother Naija - BBN), played a role in enhancing gender-biased conversations among its viewers and social media followers. Thematic analysis is employed here to study Twitter conversations among BBN 2019 followers, which ensued after the show had stopped airing. The study reveals that the show influenced the way viewers and fans engaged with each other, as well as with the show’s participants, on Twitter, and argues that, despite having aired for a short period of time, BBN 2019 was able to draw people together and provide a community where viewers could engage with each other online. Though the show aired on TV, the viewers found a digital space where they could air their views, react to what was happening on the show, as well as simply catch up on action that they probably missed. Within these digital communities, viewers expressed their attractions, disgust and identities, most of these having a form of reference to sexuality and gender identities and roles, as were also portrayed by the show’s producers both on TV and on social media. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=commodification%20of%20bodies" title="commodification of bodies">commodification of bodies</a>, <a href="https://publications.waset.org/abstracts/search?q=gender%20stereotypes" title=" gender stereotypes"> gender stereotypes</a>, <a href="https://publications.waset.org/abstracts/search?q=Big%20Brother%20Naija" title=" Big Brother Naija"> Big Brother Naija</a>, <a href="https://publications.waset.org/abstracts/search?q=social%20media" title=" social media"> social media</a> </p> <a href="https://publications.waset.org/abstracts/128386/colada-sweet-like-mercy-gender-stereotyping-in-twitter-conversations-by-big-brother-naija-2019-viewers" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/128386.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">133</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">17</span> Love and Money: Societal Attitudes Toward Income Disparities in Age-Gap Relationships</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Victoria%20Scarratt">Victoria Scarratt</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Couples involved in age-gap relationships generally evoke negative stereotypes, opinions, and social disapproval. This research seeks to examine whether financial disparities in age-discrepant relationships cause negative attitudes in study participants. It was hypothesized that an age-gap couple (29 year difference) would receive a greater degree of societal disapproval when the couple also had a large salary gap compared to a similarly aged couple (1 year difference) with a salary gap. Additionally, there would be no significant difference between age-gap couples without a salary-gap compared to a similarly aged couple without a salary gap. To test the hypothesis, participants were given one of four scenarios regarding a couple in a romantic relationship.Then they were asked to respond to nine Likert scale questions. Results indicated that participants perceived age-gap relationships with a salary disparity to be less equitable in regard to a power imbalance between the couple and the financial and general gain that one partner will receive. A significant interaction was also detected for evoking feelings of disgust in participants and how morally correct it is for the couple to continue their relationship. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=age%20gap%20relationships" title="age gap relationships">age gap relationships</a>, <a href="https://publications.waset.org/abstracts/search?q=love" title=" love"> love</a>, <a href="https://publications.waset.org/abstracts/search?q=financial%20disparities" title=" financial disparities"> financial disparities</a>, <a href="https://publications.waset.org/abstracts/search?q=societal%20stigmas" title=" societal stigmas"> societal stigmas</a>, <a href="https://publications.waset.org/abstracts/search?q=relationship%20dynamics" title=" relationship dynamics"> relationship dynamics</a> </p> <a href="https://publications.waset.org/abstracts/157349/love-and-money-societal-attitudes-toward-income-disparities-in-age-gap-relationships" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/157349.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">115</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">16</span> Emotional Intelligence in the Modern World: A Quantitative and Qualitative Study of the UMCS Students</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Anna%20Dabrowska">Anna Dabrowska</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Taking Daniel Goleman’s (1994) belief that success in life depends on IQ in 20% and in 80% on emotional intelligence, and that it is worth considering emotional intelligence as an important factor in human performance and development potential, the aim of the paper is to explore the range of emotions experienced by university students who represent Society 5.0. This quantitative and qualitative study is meant to explore not only the list of the most and least experienced emotions by the students, but also the main reasons behind these feelings. The database of the study consists of 115 respondents out of 129 students of the 1st and 5th year of Applied Linguistics at Maria Curie-Skłodowska University, which constitutes 89% of those being surveyed. The data is extracted from the anonymous questionnaire, which comprises young people’s answers and discourse concerning the causes of their most experienced emotions. Following Robert Plutchik’s theory of eight primary emotions, i.e. anger, fear, sadness, disgust, surprise, anticipation, trust, and joy, we adopt his argument for the primacy of these emotions by showing each to be the trigger of behaviour with high survival value. In fact, all other emotions are mixed or derivative states; that is, they occur as combinations, mixtures, or compounds of the primary emotions. Accordingly, the eight primary emotions, and their mixed states, are checked in the study on the students. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotions" title="emotions">emotions</a>, <a href="https://publications.waset.org/abstracts/search?q=intelligence" title=" intelligence"> intelligence</a>, <a href="https://publications.waset.org/abstracts/search?q=students" title=" students"> students</a>, <a href="https://publications.waset.org/abstracts/search?q=discourse%20study" title=" discourse study"> discourse study</a>, <a href="https://publications.waset.org/abstracts/search?q=emotional%20intelligence" title=" emotional intelligence"> emotional intelligence</a> </p> <a href="https://publications.waset.org/abstracts/187320/emotional-intelligence-in-the-modern-world-a-quantitative-and-qualitative-study-of-the-umcs-students" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/187320.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">41</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">15</span> Exploring Visual Methodologies for Measuring Public Perception of Sex Offenders</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sasha%20Goodwin">Sasha Goodwin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Sex offenders are often viewed as a homogenous group, but they encompass a diverse range of individuals with varying characteristics and offenses. The principal aim of this study was to ascertain how members of the Australian public perceive and define a sex offender while also investigating the emotional underpinnings associated with these attitudes and definitions. To assess public attitude, this study used the innovative utilization of visual methodologies to assess the public's perception of sex offenders. The study employed the iSquare approach, a visual methodology framework that offers unique viewpoints and insights into public attitudes toward sex offenders. Through the utilization of this approach, this study established an academic foundation for a deeper understanding of the public's perception of sex offenders. The data analysis revealed that most participants associated sex offenders with strong negative emotions, primarily disgust and anger. The findings of this research point towards the potential for fostering a social environment characterized by evidence-based discussions instead of reactionary punitive responses. Promoting a comprehensive understanding of the diverse nature of sexual offenders aims to broaden perceptions, fostering constructive attitudes. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=visual%20methodologies" title="visual methodologies">visual methodologies</a>, <a href="https://publications.waset.org/abstracts/search?q=public%20perception" title=" public perception"> public perception</a>, <a href="https://publications.waset.org/abstracts/search?q=sex%20offenders" title=" sex offenders"> sex offenders</a>, <a href="https://publications.waset.org/abstracts/search?q=offender%20characteristics" title=" offender characteristics"> offender characteristics</a>, <a href="https://publications.waset.org/abstracts/search?q=emotional%20attitudes" title=" emotional attitudes"> emotional attitudes</a>, <a href="https://publications.waset.org/abstracts/search?q=isquare%20approach" title=" isquare approach"> isquare approach</a>, <a href="https://publications.waset.org/abstracts/search?q=attitudes" title=" attitudes"> attitudes</a> </p> <a href="https://publications.waset.org/abstracts/181888/exploring-visual-methodologies-for-measuring-public-perception-of-sex-offenders" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/181888.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">63</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">14</span> Analysis and Detection of Facial Expressions in Autism Spectrum Disorder People Using Machine Learning</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Muhammad%20Maisam%20Abbas">Muhammad Maisam Abbas</a>, <a href="https://publications.waset.org/abstracts/search?q=Salman%20Tariq"> Salman Tariq</a>, <a href="https://publications.waset.org/abstracts/search?q=Usama%20Riaz"> Usama Riaz</a>, <a href="https://publications.waset.org/abstracts/search?q=Muhammad%20Tanveer"> Muhammad Tanveer</a>, <a href="https://publications.waset.org/abstracts/search?q=Humaira%20Abdul%20Ghafoor"> Humaira Abdul Ghafoor</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Autism Spectrum Disorder (ASD) refers to a developmental disorder that impairs an individual's communication and interaction ability. Individuals feel difficult to read facial expressions while communicating or interacting. Facial Expression Recognition (FER) is a unique method of classifying basic human expressions, i.e., happiness, fear, surprise, sadness, disgust, neutral, and anger through static and dynamic sources. This paper conducts a comprehensive comparison and proposed optimal method for a continued research project—a system that can assist people who have Autism Spectrum Disorder (ASD) in recognizing facial expressions. Comparison has been conducted on three supervised learning algorithms EigenFace, FisherFace, and LBPH. The JAFFE, CK+, and TFEID (I&II) datasets have been used to train and test the algorithms. The results were then evaluated based on variance, standard deviation, and accuracy. The experiments showed that FisherFace has the highest accuracy for all datasets and is considered the best algorithm to be implemented in our system. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=autism%20spectrum%20disorder" title="autism spectrum disorder">autism spectrum disorder</a>, <a href="https://publications.waset.org/abstracts/search?q=ASD" title=" ASD"> ASD</a>, <a href="https://publications.waset.org/abstracts/search?q=EigenFace" title=" EigenFace"> EigenFace</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20expression%20recognition" title=" facial expression recognition"> facial expression recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=FisherFace" title=" FisherFace"> FisherFace</a>, <a href="https://publications.waset.org/abstracts/search?q=local%20binary%20pattern%20histogram" title=" local binary pattern histogram"> local binary pattern histogram</a>, <a href="https://publications.waset.org/abstracts/search?q=LBPH" title=" LBPH"> LBPH</a> </p> <a href="https://publications.waset.org/abstracts/129718/analysis-and-detection-of-facial-expressions-in-autism-spectrum-disorder-people-using-machine-learning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/129718.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">174</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">13</span> The Effect of Mood and Normative Conformity on Prosocial Behavior</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Antoine%20Miguel%20Borromeo">Antoine Miguel Borromeo</a>, <a href="https://publications.waset.org/abstracts/search?q=Kristian%20Anthony%20Menez"> Kristian Anthony Menez</a>, <a href="https://publications.waset.org/abstracts/search?q=Moira%20Louise%20Ordonez"> Moira Louise Ordonez</a>, <a href="https://publications.waset.org/abstracts/search?q=David%20Carl%20Rabaya"> David Carl Rabaya</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This study aimed to test if induced mood and normative conformity have any effect specifically on prosocial behavior, which was operationalized as the willingness to donate to a non-government organization. The effect of current attitude towards the object of the prosocial behavior was also considered with a covariate test. Undergraduates taking an introductory course on psychology (N = 132) from the University of the Philippines Diliman were asked how much money they were willing to donate after being presented a video about coral reef destruction and a website that advocates towards saving the coral reefs. A 3 (Induced mood: Positive vs Fear and Sadness vs Anger, Contempt, and Disgust) x 2 (Normative conformity: Presence vs Absence) between-subjects analysis of covariance was used for experimentation. Prosocial behavior was measured by presenting a circumstance wherein participants were given money and asked if they were willing to donate an amount to the non-government organization. An analysis of covariance revealed that the mood induced has no significant effect on prosocial behavior, F(2,125) = 0.654, p > 0.05. The analysis also showed how normative conformity has no significant effect on prosocial behavior, F(1,125) = 0.238, p > 0.05, as well as their interaction F(2, 125) = 1.580, p > 0.05. 
12. Facial Expression Phoenix (FePh): An Annotated Sequenced Dataset for Facial and Emotion-Specified Expressions in Sign Language
Authors: Marie Alaghband, Niloofar Yousefi, Ivan Garibay
Abstract: Facial expressions are important parts of both gesture and sign language recognition systems. Despite recent advances in both fields, annotated facial expression datasets in the context of sign language are still scarce. This manuscript introduces an annotated sequenced facial expression dataset in the context of sign language, comprising over 3000 facial images extracted from the daily news and weather forecasts of the public TV station PHOENIX. Unlike the majority of existing facial expression datasets, FePh provides sequenced semi-blurry facial images with different head poses, orientations, and movements. In addition, in the majority of images the identities are mouthing words, which makes the data more challenging. To annotate the dataset, primary, secondary, and tertiary dyads of the seven basic emotions "sad", "surprise", "fear", "angry", "neutral", "disgust", and "happy" are considered, with a "None" class for images whose facial expression cannot be described by any of these emotions. Although FePh is presented as a facial expression dataset of signers in sign language, it has wider application in gesture recognition and Human Computer Interaction (HCI) systems.
Keywords: annotated facial expression dataset, gesture recognition, sequenced facial expression dataset, sign language recognition
Procedia: https://publications.waset.org/abstracts/129717/facial-expression-phoenix-feph-an-annotated-sequenced-dataset-for-facial-and-emotion-specified-expressions-in-sign-language | PDF: https://publications.waset.org/abstracts/129717.pdf | Downloads: 159
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=annotated%20facial%20expression%20dataset" title="annotated facial expression dataset">annotated facial expression dataset</a>, <a href="https://publications.waset.org/abstracts/search?q=gesture%20recognition" title=" gesture recognition"> gesture recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=sequenced%20facial%20expression%20dataset" title=" sequenced facial expression dataset"> sequenced facial expression dataset</a>, <a href="https://publications.waset.org/abstracts/search?q=sign%20language%20recognition" title=" sign language recognition"> sign language recognition</a> </p> <a href="https://publications.waset.org/abstracts/129717/facial-expression-phoenix-feph-an-annotated-sequenced-dataset-for-facial-and-emotion-specified-expressions-in-sign-language" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/129717.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">159</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">11</span> Comparison Study of Machine Learning Classifiers for Speech Emotion Recognition</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Aishwarya%20Ravindra%20Fursule">Aishwarya Ravindra Fursule</a>, <a href="https://publications.waset.org/abstracts/search?q=Shruti%20Kshirsagar"> Shruti Kshirsagar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the intersection of artificial intelligence and human-centered computing, this paper delves into speech emotion recognition (SER). It presents a comparative analysis of machine learning models such as K-Nearest Neighbors (KNN),logistic regression, support vector machines (SVM), decision trees, ensemble classifiers, and random forests, applied to SER. The research employs four datasets: Crema D, SAVEE, TESS, and RAVDESS. It focuses on extracting salient audio signal features like Zero Crossing Rate (ZCR), Chroma_stft, Mel Frequency Cepstral Coefficients (MFCC), root mean square (RMS) value, and MelSpectogram. These features are used to train and evaluate the models’ ability to recognize eight types of emotions from speech: happy, sad, neutral, angry, calm, disgust, fear, and surprise. Among the models, the Random Forest algorithm demonstrated superior performance, achieving approximately 79% accuracy. This suggests its suitability for SER within the parameters of this study. The research contributes to SER by showcasing the effectiveness of various machine learning algorithms and feature extraction techniques. The findings hold promise for the development of more precise emotion recognition systems in the future. This abstract provides a succinct overview of the paper’s content, methods, and results. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=comparison" title="comparison">comparison</a>, <a href="https://publications.waset.org/abstracts/search?q=ML%20classifiers" title=" ML classifiers"> ML classifiers</a>, <a href="https://publications.waset.org/abstracts/search?q=KNN" title=" KNN"> KNN</a>, <a href="https://publications.waset.org/abstracts/search?q=decision%20tree" title=" decision tree"> decision tree</a>, <a href="https://publications.waset.org/abstracts/search?q=SVM" title=" SVM"> SVM</a>, <a href="https://publications.waset.org/abstracts/search?q=random%20forest" title=" random forest"> random forest</a>, <a href="https://publications.waset.org/abstracts/search?q=logistic%20regression" title=" logistic regression"> logistic regression</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20classifiers" title=" ensemble classifiers"> ensemble classifiers</a> </p> <a href="https://publications.waset.org/abstracts/185323/comparison-study-of-machine-learning-classifiers-for-speech-emotion-recognition" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/185323.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">45</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">10</span> Exploring Public Opinions Toward the Use of Generative Artificial Intelligence Chatbot in Higher Education: An Insight from Topic Modelling and Sentiment Analysis</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Samer%20Muthana%20Sarsam">Samer Muthana Sarsam</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdul%20Samad%20Shibghatullah"> Abdul Samad Shibghatullah</a>, <a href="https://publications.waset.org/abstracts/search?q=Chit%20Su%20Mon"> Chit Su Mon</a>, <a href="https://publications.waset.org/abstracts/search?q=Abd%20Aziz%20Alias"> Abd Aziz Alias</a>, <a href="https://publications.waset.org/abstracts/search?q=Hosam%20Al-Samarraie"> Hosam Al-Samarraie</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Generative Artificial Intelligence chatbots (GAI chatbots) have emerged as promising tools in various domains, including higher education. However, their specific role within the educational context and the level of legal support for their implementation remain unclear. Therefore, this study aims to investigate the role of Bard, a newly developed GAI chatbot, in higher education. To achieve this objective, English tweets were collected from Twitter's free streaming Application Programming Interface (API). The Latent Dirichlet Allocation (LDA) algorithm was applied to extract latent topics from the collected tweets. User sentiments, including disgust, surprise, sadness, anger, fear, joy, anticipation, and trust, as well as positive and negative sentiments, were extracted using the NRC Affect Intensity Lexicon and SentiStrength tools. This study explored the benefits, challenges, and future implications of integrating GAI chatbots in higher education. The findings shed light on the potential power of such tools, exemplified by Bard, in enhancing the learning process and providing support to students throughout their educational journey. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=generative%20artificial%20intelligence%20chatbots" title="generative artificial intelligence chatbots">generative artificial intelligence chatbots</a>, <a href="https://publications.waset.org/abstracts/search?q=bard" title=" bard"> bard</a>, <a href="https://publications.waset.org/abstracts/search?q=higher%20education" title=" higher education"> higher education</a>, <a href="https://publications.waset.org/abstracts/search?q=topic%20modelling" title=" topic modelling"> topic modelling</a>, <a href="https://publications.waset.org/abstracts/search?q=sentiment%20analysis" title=" sentiment analysis"> sentiment analysis</a> </p> <a href="https://publications.waset.org/abstracts/167942/exploring-public-opinions-toward-the-use-of-generative-artificial-intelligence-chatbot-in-higher-education-an-insight-from-topic-modelling-and-sentiment-analysis" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/167942.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">83</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">9</span> Dynamic Gabor Filter Facial Features-Based Recognition of Emotion in Video Sequences</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=T.%20Hari%20Prasath">T. Hari Prasath</a>, <a href="https://publications.waset.org/abstracts/search?q=P.%20Ithaya%20Rani"> P. Ithaya Rani</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the world of visual technology, recognizing emotions from the face images is a challenging task. Several related methods have not utilized the dynamic facial features effectively for high performance. This paper proposes a method for emotions recognition using dynamic facial features with high performance. Initially, local features are captured by Gabor filter with different scale and orientations in each frame for finding the position and scale of face part from different backgrounds. The Gabor features are sent to the ensemble classifier for detecting Gabor facial features. The region of dynamic features is captured from the Gabor facial features in the consecutive frames which represent the dynamic variations of facial appearances. In each region of dynamic features is normalized using Z-score normalization method which is further encoded into binary pattern features with the help of threshold values. The binary features are passed to Multi-class AdaBoost classifier algorithm with the well-trained database contain happiness, sadness, surprise, fear, anger, disgust, and neutral expressions to classify the discriminative dynamic features for emotions recognition. The developed method is deployed on the Ryerson Multimedia Research Lab and Cohn-Kanade databases and they show significant performance improvement owing to their dynamic features when compared with the existing methods. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=detecting%20face" title="detecting face">detecting face</a>, <a href="https://publications.waset.org/abstracts/search?q=Gabor%20filter" title=" Gabor filter"> Gabor filter</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-class%20AdaBoost%20classifier" title=" multi-class AdaBoost classifier"> multi-class AdaBoost classifier</a>, <a href="https://publications.waset.org/abstracts/search?q=Z-score%20normalization" title=" Z-score normalization"> Z-score normalization</a> </p> <a href="https://publications.waset.org/abstracts/85005/dynamic-gabor-filter-facial-features-based-recognition-of-emotion-in-video-sequences" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/85005.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">278</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">8</span> Making Sense of Cyber Pornography among Young Adult Couples</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Marianne%20Lumacang">Marianne Lumacang</a>, <a href="https://publications.waset.org/abstracts/search?q=Jessarine%20Dultra"> Jessarine Dultra</a>, <a href="https://publications.waset.org/abstracts/search?q=Joana%20Fenol"> Joana Fenol</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Filipinos are known to be conservative, sex or pornography is not discussed openly in the Philippines, topic of sex, when raised, will most likely elicit snickers, jokes, and blushes in most Filipino or expressions of disgust. However, a lot of Filipinos are still engaging into this kind of activity for some reasons. The study aims to determine young adult’s point of view about cyber pornography viewing, as well as their reasons for engagement, and its effects on them and their relationship with their partner. Interpretative Phenomenological Analysis was used to explore how young adults make sense of cyber pornography viewing. The study focused on Filipino young adults who are in a romantic or married relationship, engage in cyber pornography viewing, and currently residing in Cavite, Philippines. A total of four young adult couples, four females and four males participated in the study as research participants. Data gathered from a total of four young adult couples resulted to a total of nine superordinate themes focusing on (1) exploring young adult couple’s rationales for cyber pornography viewing, (2) experiences of positive effects in engaging to cyber pornography viewing, (3) experiences of negative effects in engaging to cyber pornography viewing, (4) experience of infidelity, (5) experience of necessity, (6) females perception about cyber pornography viewing towards self, (7) males perception about cyber pornography viewing towards self, (8) males perception about cyber pornography viewing towards romantic partner, and (9) males perception about cyber pornography viewing towards others. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=cyberpornography" title="cyberpornography">cyberpornography</a>, <a href="https://publications.waset.org/abstracts/search?q=Filipino" title=" Filipino"> Filipino</a>, <a href="https://publications.waset.org/abstracts/search?q=interpretative%20phenomenological%20analysis" title=" interpretative phenomenological analysis"> interpretative phenomenological analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=making%20sense%20of%20cyberpornography" title=" making sense of cyberpornography"> making sense of cyberpornography</a>, <a href="https://publications.waset.org/abstracts/search?q=young%20adult" title=" young adult"> young adult</a> </p> <a href="https://publications.waset.org/abstracts/62378/making-sense-of-cyber-pornography-among-young-adult-couples" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/62378.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">313</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7</span> Text Emotion Recognition by Multi-Head Attention based Bidirectional LSTM Utilizing Multi-Level Classification</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Vishwanath%20Pethri%20Kamath">Vishwanath Pethri Kamath</a>, <a href="https://publications.waset.org/abstracts/search?q=Jayantha%20Gowda%20Sarapanahalli"> Jayantha Gowda Sarapanahalli</a>, <a href="https://publications.waset.org/abstracts/search?q=Vishal%20Mishra"> Vishal Mishra</a>, <a href="https://publications.waset.org/abstracts/search?q=Siddhesh%20Balwant%20Bandgar"> Siddhesh Balwant Bandgar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Recognition of emotional information is essential in any form of communication. Growing HCI (Human-Computer Interaction) in recent times indicates the importance of understanding of emotions expressed and becomes crucial for improving the system or the interaction itself. In this research work, textual data for emotion recognition is used. The text being the least expressive amongst the multimodal resources poses various challenges such as contextual information and also sequential nature of the language construction. In this research work, the proposal is made for a neural architecture to resolve not less than 8 emotions from textual data sources derived from multiple datasets using google pre-trained word2vec word embeddings and a Multi-head attention-based bidirectional LSTM model with a one-vs-all Multi-Level Classification. The emotions targeted in this research are Anger, Disgust, Fear, Guilt, Joy, Sadness, Shame, and Surprise. Textual data from multiple datasets were used for this research work such as ISEAR, Go Emotions, Affect datasets for creating the emotions’ dataset. Data samples overlap or conflicts were considered with careful preprocessing. Our results show a significant improvement with the modeling architecture and as good as 10 points improvement in recognizing some emotions. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=text%20emotion%20recognition" title="text emotion recognition">text emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=bidirectional%20LSTM" title=" bidirectional LSTM"> bidirectional LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-head%20attention" title=" multi-head attention"> multi-head attention</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-level%20classification" title=" multi-level classification"> multi-level classification</a>, <a href="https://publications.waset.org/abstracts/search?q=google%20word2vec%20word%20embeddings" title=" google word2vec word embeddings"> google word2vec word embeddings</a> </p> <a href="https://publications.waset.org/abstracts/148957/text-emotion-recognition-by-multi-head-attention-based-bidirectional-lstm-utilizing-multi-level-classification" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/148957.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">174</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6</span> Homosexuality and Culture: A Case Study Depicting the Struggles of a Married Lady</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Athulya%20Jayakumar">Athulya Jayakumar</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Manjula"> M. Manjula</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Though there has been a shift in the understanding of homosexuality from being a sin, crime or pathology in the medical and legal perspectives, the acceptance of homosexuality still remains very scanty in the Indian subcontinent. The present case study is a 24-year-old female who has completed a diploma in polytechnic engineering and residing in the state of Kerala. She initially presented with her husband with complaints of lack of sexual desire and non-cooperation from the index client. After an initial few sessions, the client revealed, in an individual session, about her homosexual orientation which was unknown to her family. She has had multiple short-term relations with females and never had any heterosexual orientation/interest. During her adolescence, she was wondering if she could change herself into a male. However, currently, she accepts her gender. She never wanted a heterosexual marriage; but, had to succumb to the pressure of mother, as a result of a series of unexpected incidents at home and had to agree for the marriage, also with a hope that she may change herself into a bi-sexual. The client was able to bond with the husband emotionally but the multiple attempts at sexual intercourse, at the insistence of the husband, had always been non-pleasurable and induced a sense of disgust. Currently, for several months, there has not been any sexual activity. Also, she actively avoids any chance to have a warm communication with him so that she can avoid chances of him approaching her in a sexual manner. 
The case study is an attempt to highlight the cultural context and the struggles of a homosexual individual who comes to therapy wanting to be a ‘normal wife’ despite being aware of the legal rights and scenario. There is a scarcity of Indian literature that systematically investigates issues related to homosexuality. Data on prevalence, the emotional problems faced, and the clinical services available are sparse, though such data are crucial for increasing the understanding of sexual behaviour, orientation, and the difficulties faced in India. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=case%20study" title="case study">case study</a>, <a href="https://publications.waset.org/abstracts/search?q=culture" title=" culture"> culture</a>, <a href="https://publications.waset.org/abstracts/search?q=cognitive%20behavior%20therapy" title=" cognitive behavior therapy"> cognitive behavior therapy</a>, <a href="https://publications.waset.org/abstracts/search?q=female%20homosexuality" title=" female homosexuality"> female homosexuality</a> </p> <a href="https://publications.waset.org/abstracts/52344/homosexuality-and-culture-a-case-study-depicting-the-struggles-of-a-married-lady" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/52344.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">345</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">5</span> The Association between Affective States and Sexual/Health-Related Status among Men Who Have Sex with Men in China: An Exploration Study Using Social Media Data</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Zhi-Wei%20Zheng">Zhi-Wei Zheng</a>, <a href="https://publications.waset.org/abstracts/search?q=Zhong-Qi%20Liu"> Zhong-Qi Liu</a>, <a href="https://publications.waset.org/abstracts/search?q=Jia-Ling%20Qiu"> Jia-Ling Qiu</a>, <a href="https://publications.waset.org/abstracts/search?q=Shan-Qing%20Guo"> Shan-Qing Guo</a>, <a href="https://publications.waset.org/abstracts/search?q=Zhong-Wei%20Jia"> Zhong-Wei Jia</a>, <a href="https://publications.waset.org/abstracts/search?q=Chun%20Hao"> Chun Hao</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Objectives: The purpose of this study was to examine the association between diurnal mood variation and sexual/health-related status among men who have sex with men (MSM), using data from Chinese MSM Twitter messages. The dataset consists of 843,745 postings by 377,610 MSM users located in Guangdong, culled from the MSM Chinese Twitter app. Positive affect, negative affect, sexual-related behaviors, and health-related status were measured using the Simplified Chinese Linguistic Inquiry and Word Count. Emotions, including joy, sadness, anger, fear, and disgust, were measured using the Weibo Basic Mood Lexicon. A positive sentiment score and a positive emotions score were also calculated. Linear regression models based on a permutation test were used to assess associations between affective states and sexual/health-related status. In the results, 5,871 active MSM users and their 477,374 postings were finally selected. MSM expressed positive affect and joy at 8 a.m.
and expressed negative affect and negative emotions between 2 a.m. and 4 a.m. In addition, 25.1% of negative postings were directly related to health, and 13.4% reported seeking social support during that sensitive period. MSM who were senior, educated, overweight or obese, self-identified as performing a versatile sex role, or had fewer followers, more followings, and fewer chat groups mainly expressed more negative affect and negative emotions. MSM who talked more about sexual-related behaviors had a higher positive sentiment score (β = 0.29, p < 0.001) and a higher positive emotions score (β = 0.16, p < 0.001). MSM who reported more on their health status had a lower positive sentiment score (β = -0.83, p < 0.001) and a lower positive emotions score (β = -0.37, p < 0.001). The study concluded that app-based psychological interventions for MSM should be conducted, as they may improve mental health.
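<p class="card-text">The phrase "linear regression models based on a permutation test" can be unpacked with a small sketch: fit the regression, then compare the observed slope against slopes obtained after repeatedly shuffling the outcome. The following NumPy example uses synthetic data and a single predictor; the study's actual model and covariates are not reproduced here.</p>
<pre><code># Minimal sketch of a permutation test for a regression slope. Synthetic
# data stand in for, e.g., sexual-related word share (x) vs. sentiment (y).
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
y = 0.3 * x + rng.normal(size=n)

def slope(x, y):
    # OLS slope for a single predictor.
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

beta_obs = slope(x, y)
n_perm = 10000
# Null distribution: shuffling y breaks any x-y association.
perm = np.array([slope(x, rng.permutation(y)) for _ in range(n_perm)])
# Two-sided p-value: how often a shuffled slope is at least as extreme.
p = (1 + np.sum(np.abs(perm) >= np.abs(beta_obs))) / (n_perm + 1)
print(f"beta = {beta_obs:.3f}, permutation p = {p:.4f}")
</code></pre>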
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=affect" title="affect">affect</a>, <a href="https://publications.waset.org/abstracts/search?q=men%20who%20have%20sex%20with%20men" title=" men who have sex with men"> men who have sex with men</a>, <a href="https://publications.waset.org/abstracts/search?q=sexual%20related%20behavior" title=" sexual related behavior"> sexual related behavior</a>, <a href="https://publications.waset.org/abstracts/search?q=health-related%20status" title=" health-related status"> health-related status</a>, <a href="https://publications.waset.org/abstracts/search?q=social%20media" title=" social media"> social media</a> </p> <a href="https://publications.waset.org/abstracts/95900/the-association-between-affective-states-and-sexualhealth-related-status-among-men-who-have-sex-with-men-in-china-an-exploration-study-using-social-media-data" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/95900.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">161</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">4</span> The Representation of the Medieval Idea of Ugliness in Messiaen's Saint François d’Assise</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Nana%20Katsia">Nana Katsia</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper explores the ways in which both medieval and medievalist conceptions of ugliness might be linked to the physical and spiritual transformation of the protagonists, and how this is realised through specific musical rhythms, such as the dochmiac rhythm, in the opera. As Eco and Henderson note, only one kind of ugliness could be represented in conformity with nature in the Middle Ages without destroying all aesthetic pleasure and, in turn, artistic beauty: namely, a form of ugliness which arouses disgust. Moreover, Eco explores the fact that the enemies of Christ who condemn, martyr, and crucify him are represented as wicked inside. In turn, the representation of inner wickedness and hostility toward God brings with it outward ugliness, coarseness, barbarity, and rage, ultimately resulting in the deformation of the figure. In all these regards, the non-beautiful is represented here as a necessary phase, which is not the case in classical (ancient Greek) concepts of beauty. As we can see, the understanding of disfigurement and ugliness in the Middle Ages was both varied and complex. In the Middle Ages, the disfigurement caused by leprosy (and other skin and bodily conditions) was interpreted, somewhat contradictorily, as both a curse and a gift from God. Some saints’ lives even have the saint appealing to be inflicted with the disease as part of their mission toward true humility. We shall explore how this different, non-classical concept of ugliness might be represented in Messiaen’s opera. According to Messiaen, the Leper and Saint François are the principal characters of the third scene, as both of them will be transformed, and a double miracle will take place in the process. Messiaen mirrors the idea of the true humility of the Saint’s life and positions Le Baiser au Lépreux as the culmination of the first act. The Leper’s character represents his physical and spiritual disfigurement, which are healed after the miracle. The scene can thus be viewed as an encounter between beauty and ugliness, and much of it is spent in a study of ugliness. The dochmiac rhythm is one of the most important compositional elements in the opera; it plays a crucial role in creating the dramatic musical narrative and structure of the composition. As such, we shall explore how Messiaen represents the medieval idea of ugliness in the opera through particular musical elements linked to the main protagonists’ spiritual or physical ugliness, why Messiaen makes reference to the dochmiac rhythm, and how these elements create the musical and dramatic context for the medieval aesthetic category of ugliness.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ugliness%20in%20music" title="ugliness in music">ugliness in music</a>, <a href="https://publications.waset.org/abstracts/search?q=medieval%20time" title=" medieval time"> medieval time</a>, <a href="https://publications.waset.org/abstracts/search?q=saint%20fran%C3%A7ois%20d%E2%80%99assise" title=" saint françois d’assise"> saint françois d’assise</a>, <a href="https://publications.waset.org/abstracts/search?q=messiaen" title=" messiaen"> messiaen</a> </p> <a href="https://publications.waset.org/abstracts/146010/the-representation-of-the-medieval-idea-of-ugliness-in-messiaens-saint-francois-dassise" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/146010.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">146</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3</span> Facial Behavior Modifications Following the Diffusion of the Use of Protective Masks Due to COVID-19</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Andreas%20Aceranti">Andreas Aceranti</a>, <a href="https://publications.waset.org/abstracts/search?q=Simonetta%20Vernocchi"> Simonetta Vernocchi</a>, <a href="https://publications.waset.org/abstracts/search?q=Marco%20Colorato"> Marco Colorato</a>, <a href="https://publications.waset.org/abstracts/search?q=Daniel%20Zaccariello"> Daniel Zaccariello</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Our study explores the usefulness of implementing facial expression recognition capabilities and using the Facial Action Coding System (FACS) in contexts where the other person is wearing a mask. In the communication process, the subjects use a plurality of distinct and autonomous reporting systems. Among them, the system of mimicking facial movements is worthy of attention. Basic emotion theorists have identified the existence of specific and universal patterns of facial expressions related to seven basic emotions -anger, disgust, contempt, fear, sadness, surprise, and happiness- that would distinguish one emotion from another. However, due to the COVID-19 pandemic, we have come up against the problem of having the lower half of the face covered and, therefore, not investigable due to the masks. Facial-emotional behavior is a good starting point for understanding: (1) the affective state (such as emotions), (2) cognitive activity (perplexity, concentration, boredom), (3) temperament and personality traits (hostility, sociability, shyness), (4) psychopathology (such as diagnostic information relevant to depression, mania, schizophrenia, and less severe disorders), (5) psychopathological processes that occur during social interactions patient and analyst. There are numerous methods to measure facial movements resulting from the action of muscles, see for example, the measurement of visible facial actions using coding systems (non-intrusive systems that require the presence of an observer who encodes and categorizes behaviors) and the measurement of electrical "discharges" of contracting muscles (facial electromyography; EMG). 
However, the measurement system devised by Ekman and Friesen (2002), the Facial Action Coding System (FACS), is the most comprehensive, complete, and versatile. Our study, carried out on about 1,500 subjects over three years of work, allowed us to highlight how the movements of the hands and the upper part of the face change depending on whether the subject is wearing a mask. We identified specific alterations in subjects' hand-movement patterns and upper-face expressions while wearing masks compared to when not wearing them. We believe that identifying how body language changes when facial expressions are impaired can provide a better understanding of the link between facial and bodily non-verbal language.
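<p class="card-text">As a rough illustration of why mask occlusion matters for FACS-based coding, the sketch below scores the basic emotions from detected action units (AUs) while restricting the evidence to upper-face AUs, roughly what remains observable above a mask. The AU-to-emotion table is a simplified, textbook-style approximation, not the coding scheme used in this study.</p>
<pre><code># Illustrative only: simplified prototypical AU sets per basic emotion and a
# naive score (fraction of a prototype's AUs detected). Not the authors' method.
EMOTION_AUS = {
    "surprise":  {1, 2, 5, 26},
    "fear":      {1, 2, 4, 5, 20, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15, 16},
    "happiness": {6, 12},
    "sadness":   {1, 4, 15},
}

# AUs driven by the brows, eyelids, and upper cheeks: roughly what a mask
# leaves visible. The exact cut-off is an assumption for this sketch.
UPPER_FACE_AUS = {1, 2, 4, 5, 6, 7}

def emotion_scores(detected_aus, masked=True):
    """Fraction of each emotion's prototypical AUs that were detected,
    optionally restricted to upper-face evidence (mask on)."""
    scores = {}
    for emotion, proto in EMOTION_AUS.items():
        usable = proto & UPPER_FACE_AUS if masked else proto
        visible = detected_aus & UPPER_FACE_AUS if masked else detected_aus
        scores[emotion] = len(visible & usable) / len(usable) if usable else 0.0
    return scores

detected = {1, 2, 5, 26}  # raised brows, upper-lid raise, jaw drop
print(emotion_scores(detected, masked=True))   # lower-face emotions (e.g.,
print(emotion_scores(detected, masked=False))  # disgust) lose all evidence
</code></pre>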
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20action%20coding%20system" title="facial action coding system">facial action coding system</a>, <a href="https://publications.waset.org/abstracts/search?q=COVID-19" title=" COVID-19"> COVID-19</a>, <a href="https://publications.waset.org/abstracts/search?q=masks" title=" masks"> masks</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20analysis" title=" facial analysis"> facial analysis</a> </p> <a href="https://publications.waset.org/abstracts/160896/facial-behavior-modifications-following-the-diffusion-of-the-use-of-protective-masks-due-to-covid-19" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/160896.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">79</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2</span> Optimized Deep Learning-Based Facial Emotion Recognition System</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Erick%20C.%20Valverde">Erick C. Valverde</a>, <a href="https://publications.waset.org/abstracts/search?q=Wansu%20Lim"> Wansu Lim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Facial emotion recognition (FER) systems have recently been developed for more advanced computer vision applications. The ability to identify human emotions would enable smart healthcare facilities to diagnose mental health conditions (e.g., depression and stress) and would support better human social interaction with smart technologies. A FER system involves two steps: 1) face detection and 2) facial emotion recognition. It classifies the human expression into categories such as anger, disgust, fear, happiness, sadness, surprise, and neutral. Such systems require intensive research to address issues arising from human diversity, unique human expressions, and the variety of human facial features across age groups; these issues generally affect the ability of a FER system to detect human emotions with high accuracy. Early FER systems used simple supervised classification algorithms such as k-nearest neighbors (KNN) and artificial neural networks (ANN). These conventional FER systems suffer from low accuracy because they are inefficient at extracting the significant features of the various human emotions. To increase the accuracy of FER systems, deep learning (DL)-based methods, such as convolutional neural networks (CNN), have been proposed. These methods can find more complex features in the human face by means of the deeper connections within their architectures. However, the inference speed and computational cost of a DL-based FER system are often disregarded in exchange for higher accuracy. To cope with this drawback, an optimized DL-based FER system is proposed in this study. An extreme version of Inception V3, known as the Xception model, is leveraged by applying different network optimization methods. Specifically, network pruning and quantization are used to lower computational cost and reduce memory usage, respectively. To support low resource requirements, a 68-landmark face detector from Dlib is used in the early step of the FER system. Furthermore, a DL compiler is utilized to apply advanced optimization techniques to the Xception model to improve the inference speed of the FER system. In comparison to VGG-Net and ResNet50, the proposed optimized DL-based FER system experimentally demonstrates the objectives of the network optimization methods used. As a result, the proposed approach can be used to create an efficient, real-time FER system.
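<p class="card-text">The two optimization steps named above, network pruning and quantization, can be sketched with standard tooling. The example below uses the TensorFlow Model Optimization toolkit on an Xception backbone and post-training quantization via the TFLite converter; the input shape, sparsity schedule, and 7-class head are assumptions for illustration, not the configuration reported in the study.</p>
<pre><code># Sketch of magnitude pruning followed by post-training quantization of an
# Xception classifier; settings are illustrative assumptions.
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Xception backbone with a 7-way emotion head (anger, disgust, fear,
# happiness, sadness, surprise, neutral).
base = tf.keras.applications.Xception(weights=None, input_shape=(128, 128, 3), classes=7)

# 1) Network pruning: zero out low-magnitude weights on a schedule during
#    fine-tuning, lowering the computational cost for sparse-aware runtimes.
pruned = tfmot.sparsity.keras.prune_low_magnitude(
    base,
    pruning_schedule=tfmot.sparsity.keras.PolynomialDecay(
        initial_sparsity=0.0, final_sparsity=0.5, begin_step=0, end_step=10000))
pruned.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# ... fine-tune `pruned` on the FER training set here (with the
# tfmot.sparsity.keras.UpdatePruningStep callback) ...
final = tfmot.sparsity.keras.strip_pruning(pruned)

# 2) Post-training quantization: store weights in 8-bit to reduce memory usage.
converter = tf.lite.TFLiteConverter.from_keras_model(final)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
with open("fer_xception_pruned_quant.tflite", "wb") as f:
    f.write(tflite_model)
</code></pre>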
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title="deep learning">deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=face%20detection" title=" face detection"> face detection</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20emotion%20recognition" title=" facial emotion recognition"> facial emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=network%20optimization%20methods" title=" network optimization methods"> network optimization methods</a> </p> <a href="https://publications.waset.org/abstracts/147341/optimized-deep-learning-based-facial-emotion-recognition-system" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/147341.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">118</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1</span> High-Resolution Facial Electromyography in Freely Behaving Humans</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Lilah%20Inzelberg">Lilah Inzelberg</a>, <a href="https://publications.waset.org/abstracts/search?q=David%20Rand"> David Rand</a>, <a href="https://publications.waset.org/abstracts/search?q=Stanislav%20Steinberg"> Stanislav Steinberg</a>, <a href="https://publications.waset.org/abstracts/search?q=Moshe%20David%20Pur"> Moshe David Pur</a>, <a href="https://publications.waset.org/abstracts/search?q=Yael%20Hanein"> Yael Hanein</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Human facial expressions carry important psychological and neurological information. Facial expressions involve the co-activation of diverse muscles. They depend strongly on personal affective interpretation and on social context, and they vary between spontaneous and voluntary activations. Smiling, as a special case, is among the most complex facial emotional expressions, involving no fewer than 7 different unilateral muscles. Despite their ubiquitous nature, smiles remain an elusive and debated topic. Smiles are associated with happiness and greeting on the one hand and with anger- or disgust-masking on the other. Accordingly, while high-resolution recording of muscle activation patterns in a non-interfering setting offers exciting opportunities, it remains an unmet challenge, as contemporary surface facial electromyography (EMG) methodologies are cumbersome, restricted to laboratory settings, and limited in time and resolution. Here we present a wearable and non-invasive method for the objective mapping of facial muscle activation and demonstrate its application in a natural setting. The technology is based on a recently developed dry and soft electrode array, specially designed for the surface facial EMG technique. Eighteen healthy volunteers (31.58 ± 3.41 years, 13 females) participated in the study. Surface EMG arrays were adhered to participants' left and right cheeks. Participants were instructed to imitate three facial expressions (closing the eyes, wrinkling the nose, and smiling voluntarily) and then to watch a funny video while their EMG signals were recorded. We focused on the muscles associated with ‘enjoyment’, ‘social’, and ‘masked’ smiles, three categories with distinct social meanings. We developed a customized independent component analysis (ICA) algorithm to construct the desired facial musculature mapping. First, identification of the Orbicularis oculi and Levator labii superioris muscles was demonstrated from voluntary expressions. Second, recordings of voluntary and spontaneous smiles were used to locate the Zygomaticus major muscle activated in Duchenne and non-Duchenne smiles. Finally, recording with a wireless device in an unmodified natural work setting revealed expressions of neutral, positive, and negative emotions in face-to-face interaction. The algorithm outlined here identifies the activation sources in a subject-specific manner, insensitive to electrode placement and anatomical diversity. Our high-resolution and cross-talk-free mapping performance, along with excellent user convenience, opens new opportunities for affective processing and the objective evaluation of facial expressivity, for objective psychological and neurological assessment, and for gaming, virtual reality, bio-feedback, and brain-machine interface applications.
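<p class="card-text">The study's ICA algorithm is customized and subject-specific; the generic sketch below only illustrates the underlying idea of unmixing multi-electrode recordings into independent muscle sources, using scikit-learn's FastICA on synthetic "EMG-like" signals.</p>
<pre><code># Generic illustration of ICA-based source separation; not the customized
# algorithm described in the abstract.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 4, 4000)

# Two synthetic "muscle" sources: bursts of high-frequency activity.
s1 = np.sin(2 * np.pi * 60 * t) * (t % 1.0 < 0.3)   # bursting source A
s2 = np.sin(2 * np.pi * 90 * t) * (t % 1.3 < 0.4)   # bursting source B
S = np.c_[s1, s2] + 0.05 * rng.normal(size=(t.size, 2))

# Three electrodes each see a mixture of the sources (cross-talk).
A = np.array([[1.0, 0.4], [0.3, 1.0], [0.7, 0.7]])  # electrodes x sources
X = S @ A.T

# Unmix: recover the source activations up to scale and order.
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)   # (n_samples, 2) estimated sources
print(ica.mixing_.shape)       # (3, 2) estimated electrode-to-source map
</code></pre>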
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=affective%20expressions" title="affective expressions">affective expressions</a>, <a href="https://publications.waset.org/abstracts/search?q=affective%20processing" title=" affective processing"> affective processing</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20EMG" title=" facial EMG"> facial EMG</a>, <a href="https://publications.waset.org/abstracts/search?q=high-resolution%20electromyography" title=" high-resolution electromyography"> high-resolution electromyography</a>, <a href="https://publications.waset.org/abstracts/search?q=independent%20component%20analysis" title=" independent component analysis"> independent component analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=wireless%20electrodes" title=" wireless electrodes"> wireless electrodes</a> </p> <a href="https://publications.waset.org/abstracts/79674/high-resolution-facial-electromyography-in-freely-behaving-humans" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/79674.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">246</span> </span> </div> </div> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research 
Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">© 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">×</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>