<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: facial filler</title> <meta name="description" content="Search results for: facial filler"> <meta name="keywords" content="facial filler"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" alt="Open 
Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="facial filler" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div 
class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="facial filler"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 524</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: facial filler</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">524</span> Exploring the Efficacy of Nitroglycerin in Filler-Induced Facial Skin Ischemia: A Narrative ‎Review</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Amir%20Feily">Amir Feily</a>, <a href="https://publications.waset.org/abstracts/search?q=Hazhir%20Shahmoradi%20Akram"> Hazhir Shahmoradi Akram</a>, <a href="https://publications.waset.org/abstracts/search?q=Mojtaba%20Ghaedi"> Mojtaba Ghaedi</a>, <a href="https://publications.waset.org/abstracts/search?q=Farshid%20Javdani"> Farshid Javdani</a>, <a href="https://publications.waset.org/abstracts/search?q=Naser%20Hatami"> Naser Hatami</a>, <a href="https://publications.waset.org/abstracts/search?q=Navid%20Kalani"> Navid Kalani</a>, <a 
href="https://publications.waset.org/abstracts/search?q=Mohammad%20Zarenezhad"> Mohammad Zarenezhad</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Background: Filler-induced facial skin ischemia is a potential complication of dermal filler injections that can result in tissue damage and necrosis. Nitroglycerin has been suggested as a treatment option due to its vasodilatory effects, but its efficacy in this context is unclear. Methods: A narrative review was conducted to examine the available evidence on the efficacy of nitroglycerin in filler-induced facial skin ischemia. Relevant studies were identified through a search of electronic databases and manual searching of reference lists. Results: The review found limited evidence supporting the efficacy of nitroglycerin in this context. While there were case reports in which the combination of nitroglycerin and hyaluronidase successfully treated filler-induced facial skin ischemia, there was only one case report in which nitroglycerin alone was successful. Furthermore, a rat model demonstrated no benefit of nitroglycerin and instead showed harmful effects. Conclusion: The evidence regarding the efficacy of nitroglycerin in filler-induced facial skin ischemia is inconclusive and appears to argue against its use. Further research is needed to determine the effectiveness of nitroglycerin alone and in combination with other treatments for this condition. Clinicians should consider this limited evidence base when deciding on treatment options for patients with filler-induced facial skin ischemia. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=nitroglycerin" title="nitroglycerin">nitroglycerin</a>, <a href="https://publications.waset.org/abstracts/search?q=facial" title=" facial"> facial</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20ischemia" title=" skin ischemia"> skin ischemia</a>, <a href="https://publications.waset.org/abstracts/search?q=fillers" title=" fillers"> fillers</a>, <a href="https://publications.waset.org/abstracts/search?q=efficacy" title=" efficacy"> efficacy</a>, <a href="https://publications.waset.org/abstracts/search?q=narrative%20review" title=" narrative review"> narrative review</a> </p> <a href="https://publications.waset.org/abstracts/171621/exploring-the-efficacy-of-nitroglycerin-in-filler-induced-facial-skin-ischemia-a-narrative-review" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/171621.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">92</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">523</span> Algorithmic Approach to Management of Complications of Permanent Facial Filler: A Saudi Experience</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Luay%20Alsalmi">Luay Alsalmi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Background: Facial filler is the most common type of cosmetic procedure, second only to Botox. Permanent fillers are preferred nowadays due to the lower cost of not requiring recurring injection appointments. However, such fillers pose a higher risk for complications, with even greater adverse effects when the procedure is done using unknown dermal filler injections. 
Aim: This study aimed to establish an algorithm to categorize and manage patients who receive permanent fillers. Materials and Methods: Twelve participants presented to the service, either through the emergency department or as outpatients, from November 2015 to May 2021. Demographics such as age, sex, date of injection, time of onset, and types of complications were collected. After examination, all cases were managed according to the established algorithm. FACE-Q was used to measure overall satisfaction and psychological well-being. Results: This study established an algorithm to diagnose and manage these patients effectively, with a high satisfaction rate. All participants were non-smoking females with no known medical comorbidities. The algorithm presented determined the treatment plan when complications were encountered. High appearance-related psychosocial distress was observed prior to surgery, which dropped significantly after surgery. FACE-Q established evidence of satisfactory ratings among patients before and after surgery. Conclusion: This treatment algorithm can guide the surgeon in formulating a suitable plan with fewer complications and a high satisfaction rate. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20filler" title="facial filler">facial filler</a>, <a href="https://publications.waset.org/abstracts/search?q=FACE-Q" title=" FACE-Q"> FACE-Q</a>, <a href="https://publications.waset.org/abstracts/search?q=psycho-social%20stress" title=" psycho-social stress"> psycho-social stress</a>, <a href="https://publications.waset.org/abstracts/search?q=botox" title=" botox"> botox</a>, <a href="https://publications.waset.org/abstracts/search?q=treatment%20algorithm" title=" treatment algorithm"> treatment algorithm</a> </p> <a href="https://publications.waset.org/abstracts/155489/algorithmic-approach-to-management-of-complications-of-permanent-facial-filler-a-saudi-experience" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/155489.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">84</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">522</span> Use of Computer and Machine Learning in Facial Recognition</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Neha%20Singh">Neha Singh</a>, <a href="https://publications.waset.org/abstracts/search?q=Ananya%20Arora"> Ananya Arora</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Facial expression measurement plays a crucial role in the identification of emotion. Facial expression plays a key role in psychophysiology, neural bases, and emotional disorder, to name a few. The Facial Action Coding System (FACS) has proven to be the most efficient and widely used of the various systems used to describe facial expressions. 
Coders can manually code facial expressions with FACS and, by viewing video-recorded facial behaviour at a specified frame rate and in slow motion, can decompose expressions into action units (AUs). Action units are the smallest visually discriminable facial movements. FACS explicitly differentiates between facial actions and inferences about what the actions mean. The action unit is the fundamental unit of FACS methodology, and FACS is regarded as the standard measure for facial behaviour, finding application in various fields of study beyond emotion science. These include facial neuromuscular disorders, neuroscience, computer vision, computer graphics and animation, and face encoding for digital processing. This paper discusses the conceptual basis for FACS, a numerical listing of the discrete facial movements identified by the system, the system's psychometric evaluation, and the software's recommended training requirements. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20action" title="facial action">facial action</a>, <a href="https://publications.waset.org/abstracts/search?q=action%20units" title=" action units"> action units</a>, <a href="https://publications.waset.org/abstracts/search?q=coding" title=" coding"> coding</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a> </p> <a href="https://publications.waset.org/abstracts/161142/use-of-computer-and-machine-learning-in-facial-recognition" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/161142.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">106</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">521</span> Management of 
Facial Nerve Palsy Following Physiotherapy </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Bassam%20Band">Bassam Band</a>, <a href="https://publications.waset.org/abstracts/search?q=Simon%20Freeman"> Simon Freeman</a>, <a href="https://publications.waset.org/abstracts/search?q=Rohan%20Munir"> Rohan Munir</a>, <a href="https://publications.waset.org/abstracts/search?q=Hisham%20Band"> Hisham Band</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Objective: To determine the efficacy of facial physiotherapy provided for patients with facial nerve palsy. Design: Retrospective study. Subjects: 54 patients diagnosed with facial nerve palsy were included in the study after meeting the selection criteria, including unilateral facial paralysis and start of therapy twelve months after the onset of facial nerve palsy. Interventions: Patients received the treatment offered at a facial physiotherapy clinic, consisting of trophic electrical stimulation, surface electromyography with biofeedback, neuromuscular re-education and myofascial release. Main measures: The Sunnybrook facial grading scale was used to evaluate the severity of facial paralysis. Results: This study demonstrated the positive impact of physiotherapy for patients with facial nerve palsy, with an improvement of 24.2% on the Sunnybrook facial grading score, from a mean baseline of 34.2% to 58.2%. Comparing different causes, the greatest improvement was seen in patients who had reconstructive surgery post acoustic neuroma, at 31.3%. Conclusion: The therapy shows significant improvement for patients with facial nerve palsy across different causes, even when started 12 months post onset of paralysis. This highlights the benefit of this non-invasive technique in managing facial nerve paralysis and possibly preventing the need for surgery. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20nerve%20palsy" title="facial nerve palsy">facial nerve palsy</a>, <a href="https://publications.waset.org/abstracts/search?q=treatment" title=" treatment"> treatment</a>, <a href="https://publications.waset.org/abstracts/search?q=physiotherapy" title=" physiotherapy"> physiotherapy</a>, <a href="https://publications.waset.org/abstracts/search?q=bells%20palsy" title=" bells palsy"> bells palsy</a>, <a href="https://publications.waset.org/abstracts/search?q=acoustic%20neuroma" title=" acoustic neuroma"> acoustic neuroma</a>, <a href="https://publications.waset.org/abstracts/search?q=ramsey-hunt%20syndrome" title=" ramsey-hunt syndrome"> ramsey-hunt syndrome</a> </p> <a href="https://publications.waset.org/abstracts/19940/management-of-facial-nerve-palsy-following-physiotherapy" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/19940.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">535</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">520</span> The Electrical Properties of Polyester Materials as Outdoor Insulators</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=R.%20M.%20EL-Sharkawy">R. M. EL-Sharkawy</a>, <a href="https://publications.waset.org/abstracts/search?q=L.%20S.%20Nasrat"> L. S. Nasrat</a>, <a href="https://publications.waset.org/abstracts/search?q=K.%20B.%20Ewiss"> K. B. Ewiss</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This work presents a study of flashover voltage for outdoor polyester and composite insulators under dry, ultra-violet and contaminated conditions. 
Cylindrical polyester composite samples of different lengths were prepared, incorporating different concentrations of an inorganic filler, e.g., magnesium hydroxide [Mg(OH)2], to improve the electrical and thermal properties, maximize surface flashover voltage, and reduce tracking phenomena. Results showed that flashover voltage reached 46 kV for samples without filler and 52.6 kV for samples containing 40% [Mg(OH)2] filler in the dry condition. A comparison between different filler concentrations under various environmental conditions (dry and contaminated) showed higher flashover voltage values for samples containing 40% [Mg(OH)2] filler with a length of 3 cm than for samples containing [Mg(OH)2] filler at ratios of 20% and 30% and lengths of 0.5 cm, 1 cm, 2 cm and 2.5 cm. Flashover voltage decreased when [Mg(OH)2] filler was added to polyester samples under the ultra-violet condition; as the filler ratio increased, the flashover voltage decreased. In addition, the effect of thermal performance with respect to the surface of the sample under test was investigated in detail. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=flashover%20voltage" title="flashover voltage">flashover voltage</a>, <a href="https://publications.waset.org/abstracts/search?q=filler" title=" filler"> filler</a>, <a href="https://publications.waset.org/abstracts/search?q=polymers" title=" polymers"> polymers</a>, <a href="https://publications.waset.org/abstracts/search?q=ultra-violet%20radiation" title=" ultra-violet radiation"> ultra-violet radiation</a> </p> <a href="https://publications.waset.org/abstracts/40599/the-electrical-properties-of-polyester-materials-as-outdoor-insulators" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/40599.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">315</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">519</span> Automatic Facial Skin Segmentation Using Possibilistic C-Means Algorithm for Evaluation of Facial Surgeries</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Elham%20Alaee">Elham Alaee</a>, <a href="https://publications.waset.org/abstracts/search?q=Mousa%20Shamsi"> Mousa Shamsi</a>, <a href="https://publications.waset.org/abstracts/search?q=Hossein%20Ahmadi"> Hossein Ahmadi</a>, <a href="https://publications.waset.org/abstracts/search?q=Soroosh%20Nazem"> Soroosh Nazem</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammad%20Hossein%20Sedaaghi"> Mohammad Hossein Sedaaghi </a> </p> <p class="card-text"><strong>Abstract:</strong></p> Human face has a fundamental role in the appearance of individuals. So the importance of facial surgeries is undeniable. 
Thus, there is a need for appropriate and accurate facial skin segmentation in order to extract different features. Since the Fuzzy C-Means (FCM) clustering algorithm does not work appropriately for noisy images and outliers, in this paper we exploit the Possibilistic C-Means (PCM) algorithm in order to segment the facial skin. For this purpose, we first convert facial images from the RGB to the YCbCr color space. To evaluate the performance of the proposed algorithm, the database of Sahand University of Technology, Tabriz, Iran, was used. To allow a better assessment of the proposed algorithm, the FCM and Expectation-Maximization (EM) algorithms are also used for facial skin segmentation. The proposed method shows better results than the other segmentation methods. Results include a misclassification error of 0.032 and a region’s area error of 0.045 for the proposed algorithm. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20image" title="facial image">facial image</a>, <a href="https://publications.waset.org/abstracts/search?q=segmentation" title=" segmentation"> segmentation</a>, <a href="https://publications.waset.org/abstracts/search?q=PCM" title=" PCM"> PCM</a>, <a href="https://publications.waset.org/abstracts/search?q=FCM" title=" FCM"> FCM</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20error" title=" skin error"> skin error</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20surgery" title=" facial surgery"> facial surgery</a> </p> <a href="https://publications.waset.org/abstracts/10297/automatic-facial-skin-segmentation-using-possibilistic-c-means-algorithm-for-evaluation-of-facial-surgeries" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/10297.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge 
badge-light">586</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">518</span> Saline Aspiration Negative Intravascular Test: Mitigating Risk with Injectable Fillers</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Marcelo%20Lopes%20Dias%20Kolling">Marcelo Lopes Dias Kolling</a>, <a href="https://publications.waset.org/abstracts/search?q=Felipe%20Ferreira%20Laranjeira"> Felipe Ferreira Laranjeira</a>, <a href="https://publications.waset.org/abstracts/search?q=Guilherme%20Augusto%20Hettwer"> Guilherme Augusto Hettwer</a>, <a href="https://publications.waset.org/abstracts/search?q=Pedro%20Salom%C3%A3o%20Piccinini"> Pedro Salomão Piccinini</a>, <a href="https://publications.waset.org/abstracts/search?q=Marwan%20Masri"> Marwan Masri</a>, <a href="https://publications.waset.org/abstracts/search?q=Carlos%20Oscar%20Uebel"> Carlos Oscar Uebel</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Introduction: Injectable fillers are among the most common nonsurgical cosmetic procedures, with significant growth yearly. Knowledge of rheological and mechanical characteristics of fillers, facial anatomy, and injection technique is essential for safety. Concepts such as the use of cannula versus needle, aspiration before injection, and facial danger zones have been well discussed. In case of an accidental intravascular puncture, the pressure inside the vessel may not be sufficient to push blood into the syringe due to the characteristics of the filler product; this is especially true for calcium hydroxyapatite (CaHA) or hyaluronic acid (HA) fillers with high G’. Since viscoelastic properties of normal saline are much lower than those of fillers, aspiration with saline prior to filler injection may decrease the risk of a false negative aspiration and subsequent catastrophic effects. 
We discuss a technique to add an additional safety step to the procedure with saline aspiration prior to injection, a ‘reverse Seldinger’ technique for intravascular access, which we term SANIT: Saline Aspiration Negative Intravascular Test. Objectives: To demonstrate the author’s (PSP) technique, which adds an additional safety step to the process of filler injection, with both CaHA and HA, in order to decrease the risk of intravascular injection. Materials and Methods: Normal skin cleansing and topical anesthesia with prilocaine/lidocaine cream are performed; the facial subunits to be treated are marked. A 3mL Luer lock syringe is filled with 2mL of 0.9% normal saline and fitted with a 27G needle, which is turned one half rotation. When a cannula is to be used, the Luer lock syringe is attached to a 27G 4cm single hole disposable cannula. After skin puncture, the 3mL syringe is advanced with the plunger pulled back (negative pressure). Progress is made to the desired depth, all the while aspirating. Once the desired location of filler injection is reached, the syringe is exchanged for the syringe containing the filler, securely grabbing the hub of the needle and taking care not to dislodge the needle tip. Prior to this, we remove 0.1mL of filler to allow space inside the syringe for aspiration. We again aspirate and inject retrograde. SANIT is especially useful for CaHA, since its G’ is much higher than that of HA, and thus reflux of blood into the syringe is less likely to occur. Results: The technique has been used safely for the past two years with no adverse events; the increase in cost is negligible (only the cost of 2mL of normal saline). Over 100 patients (over 300 syringes) have been treated with this technique. The risk of accidental intravascular puncture has been calculated to be between 1:6410 and 1:40882 syringes among expert injectors; however, the consequences of intravascular injection can be catastrophic even with board-certified physicians. 
Conclusions: While the risk of intravascular filler injection is low, the consequences can be disastrous. We believe that adding the SANIT technique can help further mitigate risk with no significant untoward effects and could be considered by all those performing injectable fillers. Further follow-up is ongoing. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=injectable%20fillers" title="injectable fillers">injectable fillers</a>, <a href="https://publications.waset.org/abstracts/search?q=safety" title=" safety"> safety</a>, <a href="https://publications.waset.org/abstracts/search?q=saline%20aspiration" title=" saline aspiration"> saline aspiration</a>, <a href="https://publications.waset.org/abstracts/search?q=injectable%20filler%20complications" title=" injectable filler complications"> injectable filler complications</a>, <a href="https://publications.waset.org/abstracts/search?q=hyaluronic%20acid" title=" hyaluronic acid"> hyaluronic acid</a>, <a href="https://publications.waset.org/abstracts/search?q=calcium%20hydroxyapatite" title=" calcium hydroxyapatite"> calcium hydroxyapatite</a> </p> <a href="https://publications.waset.org/abstracts/142402/saline-aspiration-negative-intravascular-test-mitigating-risk-with-injectable-fillers" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/142402.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">150</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">517</span> Quantification and Preference of Facial Asymmetry of the Sub-Saharan Africans&#039; 3D Facial Models</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a 
href="https://publications.waset.org/abstracts/search?q=Anas%20Ibrahim%20Yahaya">Anas Ibrahim Yahaya</a>, <a href="https://publications.waset.org/abstracts/search?q=Christophe%20Soligo"> Christophe Soligo</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A substantial body of literature has reported on facial symmetry and asymmetry and their role in human mate choice. However, major gaps persist, with nearly all data originating from WEIRD (Western, Educated, Industrialised, Rich and Democratic) populations, and results remaining largely equivocal when compared across studies. This study aimed to quantify facial asymmetry from the 3D faces of the Hausa of northern Nigeria and to determine the Hausa's perceptions and judgements of standardised facial images with different levels of asymmetry using questionnaires. Data were analysed using RStudio software, and results indicated that individuals with lower levels of facial asymmetry (near facial symmetry) were perceived as more attractive, more suitable as marriage partners and more caring, whereas individuals with higher levels of facial asymmetry were perceived as more aggressive. The study concludes that all faces are asymmetric, including the most beautiful ones, and that the preference for less asymmetric faces depended not on a single facial trait but on multiple facial traits; thus, the study supports the view that physical attractiveness is not just an arbitrary social construct but, at least in part, a cue to general health, possibly related to environmental context. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=face" title="face">face</a>, <a href="https://publications.waset.org/abstracts/search?q=asymmetry" title=" asymmetry"> asymmetry</a>, <a href="https://publications.waset.org/abstracts/search?q=symmetry" title=" symmetry"> symmetry</a>, <a href="https://publications.waset.org/abstracts/search?q=Hausa" title=" Hausa"> Hausa</a>, <a href="https://publications.waset.org/abstracts/search?q=preference" title=" preference"> preference</a> </p> <a href="https://publications.waset.org/abstracts/82975/quantification-and-preference-of-facial-asymmetry-of-the-sub-saharan-africans-3d-facial-models" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/82975.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">193</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">516</span> Facial Expression Phoenix (FePh): An Annotated Sequenced Dataset for Facial and Emotion-Specified Expressions in Sign Language</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Marie%20Alaghband">Marie Alaghband</a>, <a href="https://publications.waset.org/abstracts/search?q=Niloofar%20Yousefi"> Niloofar Yousefi</a>, <a href="https://publications.waset.org/abstracts/search?q=Ivan%20Garibay"> Ivan Garibay</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Facial expressions are important parts of both gesture and sign language recognition systems. Despite the recent advances in both fields, annotated facial expression datasets in the context of sign language are still scarce resources. 
In this manuscript, we introduce an annotated sequenced facial expression dataset in the context of sign language, comprising over 3000 facial images extracted from the daily news and weather forecast of the public tv-station PHOENIX. Unlike the majority of currently existing facial expression datasets, FePh provides sequenced semi-blurry facial images with different head poses, orientations, and movements. In addition, in the majority of images, identities are mouthing the words, which makes the data more challenging. To annotate this dataset we consider primary, secondary, and tertiary dyads of seven basic emotions of &quot;sad&quot;, &quot;surprise&quot;, &quot;fear&quot;, &quot;angry&quot;, &quot;neutral&quot;, &quot;disgust&quot;, and &quot;happy&quot;. We also considered the &quot;None&quot; class if the image&rsquo;s facial expression could not be described by any of the aforementioned emotions. Although we provide FePh as a facial expression dataset of signers in sign language, it has a wider application in gesture recognition and Human Computer Interaction (HCI) systems. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=annotated%20facial%20expression%20dataset" title="annotated facial expression dataset">annotated facial expression dataset</a>, <a href="https://publications.waset.org/abstracts/search?q=gesture%20recognition" title=" gesture recognition"> gesture recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=sequenced%20facial%20expression%20dataset" title=" sequenced facial expression dataset"> sequenced facial expression dataset</a>, <a href="https://publications.waset.org/abstracts/search?q=sign%20language%20recognition" title=" sign language recognition"> sign language recognition</a> </p> <a href="https://publications.waset.org/abstracts/129717/facial-expression-phoenix-feph-an-annotated-sequenced-dataset-for-facial-and-emotion-specified-expressions-in-sign-language" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/129717.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">159</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">515</span> Study of Ladle Furnace Slag as Mineral Filler in Asphalt Concrete with Electric Arc Furnace Slag</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=W.%20J.%20Wang">W. J. Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=D.%20F.%20Lin"> D. F. Lin</a>, <a href="https://publications.waset.org/abstracts/search?q=L.%20Y.%20Chen"> L. Y. Chen</a>, <a href="https://publications.waset.org/abstracts/search?q=K.%20Y.%20Liu"> K. Y. 
Liu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this study, ladle furnace slag was used as a mineral filler in asphalt concrete with electric arc furnace slag (EAF asphalt concrete) to investigate its effect on the engineering and thermal properties of asphalt cement mastics and EAF asphalt concrete. Lime was used as a comparison mineral filler, and the mineral filler content was set at 2%, 4%, 6%, and 8%. First, the engineering properties of the ladle furnace slag and the lime were compared; the mineral filler was then mixed with bitumen to form asphalt cement mastics in order to analyze the influence of the ladle furnace slag on the properties of the mastics; lastly, the mineral filler was used in the EAF asphalt concrete to analyze the feasibility of using ladle furnace slag as a mineral filler. The results show that the ladle furnace slag and the lime have no obvious difference in their physical properties. The energy dispersive spectrometer (EDS) test results show that the lime and the ladle furnace slag have similar elemental compositions, but the Ca found in the ladle furnace slag is present as CaO, whereas in lime it is present as CaCO3; the ladle furnace slag therefore has the property of expansion. According to the test results, the viscosity of the asphalt cement mastics increases with increasing mineral filler content. Since the ladle furnace slag has a higher CaO content, mastics containing ladle furnace slag show a larger viscosity increase than those containing lime, and only 2% ladle furnace slag is needed to achieve the anti-peeling effect that requires 6% lime. The test results for the EAF asphalt concrete show that the maximum stability value is obtained at a mineral filler content of about 5%. 
When the ladle furnace slag is used as the mineral filler, it can improve the stiffness, indirect tensile strength, spalling resistance, and thermal insulation of EAF asphalt concrete, which indicates that using ladle furnace slag as the mineral filler of bitumen can help improve the durability of asphalt pavement. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ladle%20furnace%20slag" title="ladle furnace slag">ladle furnace slag</a>, <a href="https://publications.waset.org/abstracts/search?q=mineral%20filler" title=" mineral filler"> mineral filler</a>, <a href="https://publications.waset.org/abstracts/search?q=asphalt%20cement%20mastics" title=" asphalt cement mastics"> asphalt cement mastics</a>, <a href="https://publications.waset.org/abstracts/search?q=EAF%20asphalt%20concrete" title=" EAF asphalt concrete"> EAF asphalt concrete</a> </p> <a href="https://publications.waset.org/abstracts/170204/study-of-ladle-furnace-slag-as-mineral-filler-in-asphalt-concrete-with-electric-arc-furnace-slag" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/170204.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">85</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">514</span> Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Vesna%20Kirandziska">Vesna Kirandziska</a>, <a href="https://publications.waset.org/abstracts/search?q=Nevena%20Ackovska"> Nevena Ackovska</a>, <a href="https://publications.waset.org/abstracts/search?q=Ana%20Madevska%20Bogdanova"> Ana Madevska Bogdanova</a> 
</p> <p class="card-text"><strong>Abstract:</strong></p> Emotion recognition is a challenging problem; it remains open from the perspective of both intelligent systems and psychology. In this paper, both voice features and facial features are used to build an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are given, and a discussion of the validity and the expressiveness of different emotions is presented. A comparison is made between classifiers built from facial data only, from voice data only, and from the combination of both. The need for a better combination of the information from facial expressions and voice data is argued. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotion%20recognition" title="emotion recognition">emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20recognition" title=" facial recognition"> facial recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=signal%20processing" title=" signal processing"> signal processing</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a> </p> <a href="https://publications.waset.org/abstracts/42384/comparing-emotion-recognition-from-voice-and-facial-data-using-time-invariant-features" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/42384.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">315</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">513</span> Synthesis and Characterization of 
Amino-Functionalized Polystyrene Nanoparticles as Reactive Filler</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yaseen%20Elhebshi">Yaseen Elhebshi</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdulkareem%20Hamid"> Abdulkareem Hamid</a>, <a href="https://publications.waset.org/abstracts/search?q=Nureddin%20Bin%20Issa"> Nureddin Bin Issa</a>, <a href="https://publications.waset.org/abstracts/search?q=Xiaonong%20Chen"> Xiaonong Chen</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A convenient method of preparing ultrafine polystyrene latex nanoparticles with amino groups on the surface is developed. Polystyrene latexes in the size range 50–400 nm were prepared via emulsion polymerization, using sodium dodecyl sulfate (SDS) as the surfactant. Polystyrene with amino groups on the surface is well suited for use as an organic filler to modify rubber. Transmission electron microscopy (TEM) was used to observe the morphology of the silicon dioxide and functionalized polystyrene nanoparticles. The nature of bonding between the polymer and the reactive groups on the filler surfaces was analyzed using Fourier transform infrared spectroscopy (FTIR). Scanning electron microscopy (SEM) was employed to examine the filler surface. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=reactive%20filler" title="reactive filler">reactive filler</a>, <a href="https://publications.waset.org/abstracts/search?q=emulsion%20polymerization" title=" emulsion polymerization"> emulsion polymerization</a>, <a href="https://publications.waset.org/abstracts/search?q=particle%20size" title=" particle size"> particle size</a>, <a href="https://publications.waset.org/abstracts/search?q=polystyrene%20nanoparticles" title=" polystyrene nanoparticles"> polystyrene nanoparticles</a> </p> <a href="https://publications.waset.org/abstracts/9665/synthesis-and-characterization-of-amino-functionalized-polystyrene-nanoparticles-as-reactive-filler" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/9665.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">350</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">512</span> Emotion Recognition with Occlusions Based on Facial Expression Reconstruction and Weber Local Descriptor</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jadisha%20Cornejo">Jadisha Cornejo</a>, <a href="https://publications.waset.org/abstracts/search?q=Helio%20Pedrini"> Helio Pedrini</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Recognition of emotions based on facial expressions has received increasing attention from the scientific community over the last years. Several fields of applications can benefit from facial emotion recognition, such as behavior prediction, interpersonal relations, human-computer interactions, recommendation systems. 
In this work, we develop and analyze an emotion recognition framework based on facial expressions robust to occlusions through the Weber Local Descriptor (WLD). Initially, the occluded facial expressions are reconstructed following an extension approach of Robust Principal Component Analysis (RPCA). Then, WLD features are extracted from the facial expression representation, as well as Local Binary Patterns (LBP) and Histogram of Oriented Gradients (HOG). The feature vector space is reduced using Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). Finally, K-Nearest Neighbor (K-NN) and Support Vector Machine (SVM) classifiers are used to recognize the expressions. Experimental results on three public datasets demonstrated that the WLD representation achieved competitive accuracy rates for occluded and non-occluded facial expressions compared to other approaches available in the literature. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=emotion%20recognition" title="emotion recognition">emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20expression" title=" facial expression"> facial expression</a>, <a href="https://publications.waset.org/abstracts/search?q=occlusion" title=" occlusion"> occlusion</a>, <a href="https://publications.waset.org/abstracts/search?q=fiducial%20landmarks" title=" fiducial landmarks"> fiducial landmarks</a> </p> <a href="https://publications.waset.org/abstracts/90510/emotion-recognition-with-occlusions-based-on-facial-expression-reconstruction-and-weber-local-descriptor" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/90510.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">182</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 
class="card-header" style="font-size:.9rem"><span class="badge badge-info">511</span> Classifying Facial Expressions Based on a Motion Local Appearance Approach</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Fabiola%20M.%20Villalobos-Castaldi">Fabiola M. Villalobos-Castaldi</a>, <a href="https://publications.waset.org/abstracts/search?q=Nicol%C3%A1s%20C.%20Kemper"> Nicolás C. Kemper</a>, <a href="https://publications.waset.org/abstracts/search?q=Esther%20Rojas-Krugger"> Esther Rojas-Krugger</a>, <a href="https://publications.waset.org/abstracts/search?q=Laura%20G.%20Ram%C3%ADrez-S%C3%A1nchez"> Laura G. Ramírez-Sánchez</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents classification results from combining a motion-based approach with a local appearance method to describe the facial motion caused by the muscle contractions and expansions present in facial expressions. The proposed feature extraction method takes advantage of knowledge of which parts of the face exhibit the largest deformations, so we selected four specific facial regions to which the appearance descriptor was applied. The most commonly used approaches for feature extraction are the holistic and the local strategies. In this work, we present results from a local appearance approach that estimates the correlation coefficient between the four landmark-localized facial templates of the expressive face and those of the neutral face. The results show how the proposed motion estimation scheme, based on local appearance correlation, can simply and intuitively measure the motion parameters for some of the most relevant facial regions, and how these parameters can be used to recognize facial expressions automatically. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20expression%20recognition%20system" title="facial expression recognition system">facial expression recognition system</a>, <a href="https://publications.waset.org/abstracts/search?q=feature%20extraction" title=" feature extraction"> feature extraction</a>, <a href="https://publications.waset.org/abstracts/search?q=local-appearance%20method" title=" local-appearance method"> local-appearance method</a>, <a href="https://publications.waset.org/abstracts/search?q=motion-based%20approach" title=" motion-based approach"> motion-based approach</a> </p> <a href="https://publications.waset.org/abstracts/27632/classifying-facial-expressions-based-on-a-motion-local-appearance-approach" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/27632.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">413</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">510</span> Emotion Recognition Using Artificial Intelligence</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Rahul%20Mohite">Rahul Mohite</a>, <a href="https://publications.waset.org/abstracts/search?q=Lahcen%20Ouarbya"> Lahcen Ouarbya</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper focuses on the interplay between humans and computer systems and the ability of these systems to understand and respond to human emotions, including non-verbal communication. Current emotion recognition systems are based solely on either facial or verbal expressions. The limitation of these systems is that they require large training data sets. 
The paper proposes a system for recognizing human emotions that combines both speech and facial expression recognition. The system utilizes advanced techniques such as deep learning and image recognition to identify facial expressions and comprehend emotions. The results show that the proposed system, based on the combination of facial expression and speech, outperforms existing ones, which are based solely on either facial or verbal expressions. The proposed system detects human emotion with an accuracy of 86%, whereas the existing systems have an accuracy of 70% using verbal expression only and 76% using facial expression only. In this paper, the increasing significance and demand for facial recognition technology in emotion recognition are also discussed. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20reputation" title="facial recognition">facial recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=expression%20reputation" title=" expression recognition"> expression recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20gaining%20knowledge%20of" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=photo%20reputation" title=" image recognition"> image recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20technology" title=" facial technology"> facial technology</a>, <a href="https://publications.waset.org/abstracts/search?q=sign%20processing" title=" signal processing"> signal processing</a>, <a href="https://publications.waset.org/abstracts/search?q=photo%20type" title=" image classification"> image classification</a> </p> <a href="https://publications.waset.org/abstracts/162386/emotion-recognition-using-artificial-intelligence" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/162386.pdf" target="_blank" class="btn btn-primary 
btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">121</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">509</span> Improving the Performance of Deep Learning in Facial Emotion Recognition with Image Sharpening</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ksheeraj%20Sai%20Vepuri">Ksheeraj Sai Vepuri</a>, <a href="https://publications.waset.org/abstracts/search?q=Nada%20Attar"> Nada Attar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> We as humans use words with accompanying visual and facial cues to communicate effectively. Classifying facial emotion using computer vision methodologies has been an active research area in the computer vision field. In this paper, we propose a simple method for facial expression recognition that enhances accuracy. We tested our method on the FER-2013 dataset that contains static images. Instead of using Histogram equalization to preprocess the dataset, we used Unsharp Mask to emphasize texture and details and sharpened the edges. We also used ImageDataGenerator from Keras library for data augmentation. Then we used Convolutional Neural Networks (CNN) model to classify the images into 7 different facial expressions, yielding an accuracy of 69.46% on the test set. Our results show that using image preprocessing such as the sharpening technique for a CNN model can improve the performance, even when the CNN model is relatively simple. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20expression%20recognittion" title="facial expression recognition">facial expression recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20preprocessing" title=" image preprocessing"> image preprocessing</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=CNN" title=" CNN"> CNN</a> </p> <a href="https://publications.waset.org/abstracts/130679/improving-the-performance-of-deep-learning-in-facial-emotion-recognition-with-image-sharpening" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/130679.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">143</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">508</span> Somatosensory-Evoked Blink Reflex in Peripheral Facial Palsy</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sarah%20Sayed%20El-%20Tawab">Sarah Sayed El- Tawab</a>, <a href="https://publications.waset.org/abstracts/search?q=Emmanuel%20Kamal%20Azix%20Saba"> Emmanuel Kamal Azix Saba</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Objectives: Somatosensory blink reflex (SBR) is an eye blink response obtained from electrical stimulation of peripheral nerves or skin area of the body. It has been studied in various neurological diseases as well as among healthy subjects in different populations. 
We designed this study to detect SBR positivity in patients with facial palsy and patients with post-facial syndrome, to relate facial palsy severity to the presence of SBR, and to assess the association between trigeminal BR changes and SBR positivity in peripheral facial palsy patients. Methods: 50 patients with peripheral facial palsy (PFP) or post-facial syndrome (PFS) and 31 age- and gender-matched healthy volunteers were enrolled in this study. Facial motor conduction studies, trigeminal BR, and SBR were studied in all. Results: SBR was elicited in 67.7% of normal subjects, in 68% of the PFS group, and in 32% of the PFP group. Among PFP patients, SBR was found on the non-paralytic side in 28% by paralyzed-side stimulation and in 24% by healthy-side stimulation. For the PFS group, SBR was found on the non-paralytic side in 48%. Bilateral SBR elicitability was higher than unilateral elicitability. Conclusion: Increased excitability of brainstem interneurons is not essential to generate SBR. The hypothetical sensory-motor gating mechanism is responsible for SBR generation. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=somatosensory%20evoked%20blink%20reflex" title="somatosensory evoked blink reflex">somatosensory evoked blink reflex</a>, <a href="https://publications.waset.org/abstracts/search?q=post%20facial%20syndrome" title=" post facial syndrome"> post facial syndrome</a>, <a href="https://publications.waset.org/abstracts/search?q=blink%20reflex" title=" blink reflex"> blink reflex</a>, <a href="https://publications.waset.org/abstracts/search?q=enchanced%20gain" title=" enhanced gain"> enhanced gain</a> </p> <a href="https://publications.waset.org/abstracts/18913/somatosensory-evoked-blink-reflex-in-peripheral-facial-palsy" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/18913.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">619</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">507</span> KSVD-SVM Approach for Spontaneous Facial Expression Recognition</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Dawood%20Al%20Chanti">Dawood Al Chanti</a>, <a href="https://publications.waset.org/abstracts/search?q=Alice%20Caplier"> Alice Caplier</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Sparse representations of signals have received a great deal of attention in recent years. In this paper, the interest of using sparse representation as a means of performing sparse discriminative analysis between spontaneous facial expressions is demonstrated. An automatic facial expression recognition system is presented. 
It uses a KSVD-SVM approach made of three main stages: a pre-processing and feature extraction stage, which solves the problem of shared subspace distribution based on random projection theory to obtain low-dimensional discriminative and reconstructive features; a dictionary learning and sparse coding stage, which uses the KSVD model to learn discriminative under- or over-complete dictionaries for sparse coding; and finally a classification stage, which uses an SVM classifier for facial expression recognition. Our main concern is to be able to recognize non-basic affective states and non-acted expressions. Extensive experiments on the JAFFE database of static acted facial expressions, as well as on the DynEmo database of dynamic spontaneous facial expressions, exhibit very good recognition rates. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=dictionary%20learning" title="dictionary learning">dictionary learning</a>, <a href="https://publications.waset.org/abstracts/search?q=random%20projection" title=" random projection"> random projection</a>, <a href="https://publications.waset.org/abstracts/search?q=pose%20and%20spontaneous%20facial%20expression" title=" pose and spontaneous facial expression"> pose and spontaneous facial expression</a>, <a href="https://publications.waset.org/abstracts/search?q=sparse%20representation" title=" sparse representation"> sparse representation</a> </p> <a href="https://publications.waset.org/abstracts/51683/ksvd-svm-approach-for-spontaneous-facial-expression-recognition" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/51683.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">305</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge 
badge-info">506</span> Individualized Emotion Recognition Through Dual-Representations and Ground-Established Ground Truth</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Valentina%20Zhang">Valentina Zhang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> While facial expression is a complex and individualized behavior, all facial emotion recognition (FER) systems known to us rely on a single facial representation and are trained on universal data. We conjecture that: (i) different facial representations can provide different, sometimes complementing views of emotions; (ii) when employed collectively in a discussion group setting, they enable more accurate emotion reading, which is highly desirable in autism care and other application contexts that are sensitive to errors. In this paper, we first study FER using pixel-based DL vs semantics-based DL in the context of deepfake videos. Our experiment indicates that while the semantics-trained model performs better with articulated facial feature changes, the pixel-trained model outperforms on subtle or rare facial expressions. Armed with these findings, we have constructed an adaptive FER system learning from both types of models for dyadic or small interacting groups and further leveraging the synthesized group emotions as the ground truth for individualized FER training. Using a collection of group conversation videos, we demonstrate that FER accuracy and personalization can benefit from such an approach. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=neurodivergence%20care" title="neurodivergence care">neurodivergence care</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20emotion%20recognition" title=" facial emotion recognition"> facial emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=ground%20truth%20for%20supervised%20learning" title=" ground truth for supervised learning"> ground truth for supervised learning</a> </p> <a href="https://publications.waset.org/abstracts/144009/individualized-emotion-recognition-through-dual-representations-and-ground-established-ground-truth" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/144009.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">147</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">505</span> Effect of Filler Size and Shape on Positive Temperature Coefficient Effect</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Eric%20Asare">Eric Asare</a>, <a href="https://publications.waset.org/abstracts/search?q=Jamie%20Evans"> Jamie Evans</a>, <a href="https://publications.waset.org/abstracts/search?q=Mark%20Newton"> Mark Newton</a>, <a href="https://publications.waset.org/abstracts/search?q=Emiliano%20Bilotti"> Emiliano Bilotti</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Two types of filler shapes (sphere and flakes) and three different sizes are employed to study the size effect on PTC. 
The composite is prepared using a mini-extruder with high-density polyethylene (HDPE) as the matrix. Computer modelling is used to fit the experimental results. The percolation threshold decreases with decreasing filler size, and this was observed for both the spherical particles and the flakes; it is caused by the decrease in interparticle distance with decreasing filler size. The 100 µm particles showed a larger PTC intensity than the 5 µm particles for both the metal-coated glass spheres and the flakes. The small particles have a large surface area and tend to agglomerate, which makes it difficult for the conductive network to be disturbed. Increasing the filler content decreased the PTC intensity; this is due to a denser conductive network within the polymer matrix, so more energy is needed to disrupt the network. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=positive%20temperature%20coefficient%20%28PTC%29%20effect" title="positive temperature coefficient (PTC) effect">positive temperature coefficient (PTC) effect</a>, <a href="https://publications.waset.org/abstracts/search?q=conductive%20polymer%20composite%20%28CPC%29" title=" conductive polymer composite (CPC)"> conductive polymer composite (CPC)</a>, <a href="https://publications.waset.org/abstracts/search?q=electrical%20conductivity" title=" electrical conductivity"> electrical conductivity</a> </p> <a href="https://publications.waset.org/abstracts/19230/effect-of-filler-size-and-shape-on-positive-temperature-coefficient-effect" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/19230.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">427</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span 
class="badge badge-info">504</span> Mathematical Analysis of Matrix and Filler Formulation in Composite Materials</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Olusegun%20A.%20Afolabi">Olusegun A. Afolabi</a>, <a href="https://publications.waset.org/abstracts/search?q=Ndivhuwo%20Ndou"> Ndivhuwo Ndou</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Composite material is an important area that has gained global visibility in many research fields in recent years. Composite material is the combination of separate materials with different properties to form a single material having different properties from the parent materials. Material composition and combination is an important aspect of composite material. The focus of this study is to provide insight into an easy way of calculating the compositions and formulations of constituent materials that make up any composite material. The compositions of the matrix and filler used for fabricating composite materials are taken into consideration. From the composite fabricated, data can be collected and analyzed based on the test and characterizations such as tensile, flexural, compression, impact, hardness, etc. Also, the densities of the matrix and the filler with regard to their constituent materials are discussed. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=composite%20material" title="composite material">composite material</a>, <a href="https://publications.waset.org/abstracts/search?q=density" title=" density"> density</a>, <a href="https://publications.waset.org/abstracts/search?q=filler" title=" filler"> filler</a>, <a href="https://publications.waset.org/abstracts/search?q=matrix" title=" matrix"> matrix</a>, <a href="https://publications.waset.org/abstracts/search?q=percentage%20weight" title=" percentage weight"> percentage weight</a>, <a href="https://publications.waset.org/abstracts/search?q=volume%20fraction" title=" volume fraction"> volume fraction</a> </p> <a href="https://publications.waset.org/abstracts/182436/mathematical-analysis-of-matrix-and-filler-formulation-in-composite-materials" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/182436.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">67</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">503</span> Deep-Learning Based Approach to Facial Emotion Recognition through Convolutional Neural Network</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Nouha%20Khediri">Nouha Khediri</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammed%20Ben%20Ammar"> Mohammed Ben Ammar</a>, <a href="https://publications.waset.org/abstracts/search?q=Monji%20Kherallah"> Monji Kherallah</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Recently, facial emotion recognition (FER) has become increasingly essential to understand the state of the human mind. 
Accurately classifying emotion from the face is a challenging task. In this paper, we present a facial emotion recognition approach named CV-FER, benefiting from deep learning, especially CNN and VGG16. First, the data is pre-processed with data cleaning and data rotation. Then, we augment the data and proceed to our FER model, which contains five convolution layers and five pooling layers. Finally, a softmax classifier is used in the output layer to recognize emotions. The paper also reviews prior work on facial emotion recognition based on deep learning. Experiments show that our model outperforms the other methods on the same FER2013 database, yielding a recognition rate of 92%. We also put forward some suggestions for future work. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=CNN" title="CNN">CNN</a>, <a href="https://publications.waset.org/abstracts/search?q=deep-learning" title=" deep-learning"> deep-learning</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20emotion%20recognition" title=" facial emotion recognition"> facial emotion recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a> </p> <a href="https://publications.waset.org/abstracts/150291/deep-learning-based-approach-to-facial-emotion-recognition-through-convolutional-neural-network" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/150291.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">95</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">502</span> Comparison of Mechanical Property of UNS C12200 Joints Brazed by (Cu&amp;Ag) Based Filler 
Metals</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ali%20Elhatmi">Ali Elhatmi</a>, <a href="https://publications.waset.org/abstracts/search?q=Mustafa%20Elshbo"> Mustafa Elshbo</a>, <a href="https://publications.waset.org/abstracts/search?q=Hussin%20Alosta"> Hussin Alosta</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this study, the copper tube used in medical applications was brazed with copper-, zinc- and silver-based alloys, using BCuP2, RBCuZnAl and BAg2 filler metals. A sample of the medical tube was chemically analyzed, and the result matches the British standard. Tensile and hardness tests were carried out on the brazed joints; the results show that BCuP2 produced the hardest joint, while the filler metal RBCuZnAl gave the highest tensile strength. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=welding" title="welding">welding</a>, <a href="https://publications.waset.org/abstracts/search?q=Brazing" title=" Brazing"> Brazing</a>, <a href="https://publications.waset.org/abstracts/search?q=Copper%20tubes" title=" Copper tubes"> Copper tubes</a>, <a href="https://publications.waset.org/abstracts/search?q=Joints" title=" Joints"> Joints</a> </p> <a href="https://publications.waset.org/abstracts/92026/comparison-of-mechanical-property-of-uns-c12200joints-brazed-by-cuag-based-filler-metals" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/92026.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">227</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">501</span> Noninvasive Evaluation of Acupuncture by Measuring Facial Temperature through Thermal 
Image</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=An%20Guo">An Guo</a>, <a href="https://publications.waset.org/abstracts/search?q=Hieyong%20Jeong"> Hieyong Jeong</a>, <a href="https://publications.waset.org/abstracts/search?q=Tianyi%20Wang"> Tianyi Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Na%20Li"> Na Li</a>, <a href="https://publications.waset.org/abstracts/search?q=Yuko%20Ohno"> Yuko Ohno</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Acupuncture, known as sensory stimulation, has been used to treat various disorders for thousands of years. However, previous studies have not addressed approaches for noninvasive measurement to evaluate the therapeutic effect of acupuncture. The purpose of this study is to propose a noninvasive method to evaluate acupuncture by measuring facial temperature through thermal images. Three human subjects were recruited in this study. Each subject received acupuncture therapy for 30 minutes. Acupuncture needles (Ø0.16 x 30 mm) were inserted into the Baihui point (DU20), Neiguan points (PC6) and Taichong points (LR3), and acupuncture needles (Ø0.18 x 39 mm) were inserted into the Tanzhong point (RN17), Zusanli points (ST36) and Yinlingquan points (SP9). Facial temperature was recorded by an infrared thermometer. The acupuncture therapeutic effect was compared pre- and post-acupuncture. Experimental results demonstrated that facial temperature changed according to the acupuncture therapeutic effect. It was concluded that the proposed method shows high potential for evaluating acupuncture by noninvasive measurement of facial temperature. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=acupuncture" title="acupuncture">acupuncture</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20temperature" title=" facial temperature"> facial temperature</a>, <a href="https://publications.waset.org/abstracts/search?q=noninvasive%20evaluation" title=" noninvasive evaluation"> noninvasive evaluation</a>, <a href="https://publications.waset.org/abstracts/search?q=thermal%20image" title=" thermal image"> thermal image</a> </p> <a href="https://publications.waset.org/abstracts/95222/noninvasive-evaluation-of-acupuncture-by-measuring-facial-temperature-through-thermal-image" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/95222.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">187</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">500</span> Thermal Property Improvement of Silica Reinforced Epoxy Composite Specimens</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hyu%20Sang%20Jo">Hyu Sang Jo</a>, <a href="https://publications.waset.org/abstracts/search?q=Gyo%20Woo%20Lee"> Gyo Woo Lee</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this study, the mechanical and thermal properties of epoxy composites that are reinforced with micrometer-sized silica particles were investigated by using the specimen experiments. For all specimens used in this study (from the baseline to specimen containing 70 wt% silica filler), the tensile strengths were gradually increased by 8-10%, but the ductility of the specimen was decreased by 34%, compared with those of the baseline samples. 
Similarly, for the samples containing 70 wt% silica filler, the coefficient of thermal expansion was reduced by 25%, but the thermal conductivity was increased by 100%, compared with those of the baseline samples. The improvement of thermal stability of the silica-reinforced specimens was confirmed within the experimental range, and the smaller silica particles were found to be more effective in delaying the thermal expansion of the specimens. When the smaller particle was used as filler, due to the increased specific interface area between filler and matrix, the thermal conductivities of the composite specimens were measured to be slightly lower than those of the specimens reinforced with the larger particle. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=carbon%20nanotube%20filler" title="carbon nanotube filler">carbon nanotube filler</a>, <a href="https://publications.waset.org/abstracts/search?q=epoxy%20composite" title=" epoxy composite"> epoxy composite</a>, <a href="https://publications.waset.org/abstracts/search?q=mechanical%20property" title=" mechanical property"> mechanical property</a>, <a href="https://publications.waset.org/abstracts/search?q=thermal%20property" title=" thermal property"> thermal property</a> </p> <a href="https://publications.waset.org/abstracts/44711/thermal-property-improvement-of-silica-reinforced-epoxy-composite-specimens" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/44711.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">236</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">499</span> Facial Emotion Recognition Using Deep Learning</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> 
<a href="https://publications.waset.org/abstracts/search?q=Ashutosh%20Mishra">Ashutosh Mishra</a>, <a href="https://publications.waset.org/abstracts/search?q=Nikhil%20Goyal"> Nikhil Goyal</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A 3D facial emotion recognition model based on deep learning is proposed in this paper. Two convolution layers and a pooling layer are employed in the deep learning architecture, with pooling applied after the convolution process. The probabilities for the various classes of human faces are calculated using the sigmoid activation function. To verify the efficiency of the deep learning-based system, a set of faces from the Kaggle dataset is used to assess the accuracy of the face recognition model. The model's accuracy is about 65 percent, which is lower than that of other facial expression recognition techniques, despite the significant gains in representation precision afforded by the nonlinearity of deep image representations. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20recognition" title="facial recognition">facial recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=computational%20intelligence" title=" computational intelligence"> computational intelligence</a>, <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20network" title=" convolutional neural network"> convolutional neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=depth%20map" title=" depth map"> depth map</a> </p> <a href="https://publications.waset.org/abstracts/139253/facial-emotion-recognition-using-deep-learning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/139253.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">231</span> </span> 
</div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">498</span> Effect of Filler Metal Diameter on Weld Joint of Carbon Steel SA516 Gr 70 and Filler Metal SFA 5.17 in Submerged Arc Welding SAW</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=A.%20Nait%20Salah">A. Nait Salah</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20Kaddami"> M. Kaddami</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This work describes an investigation of the effect of the filler metal diameter on the weld joint, with the low-alloy carbon steel A516 Grade 70 as the base metal. Commercially, SA516 Grade 70 is frequently used for the manufacturing of pressure vessels, boilers, storage tanks, etc. In the fabrication industry, the hardness of the weld joint is among the important parameters to check after heat treatment of the weld. Submerged arc welding (SAW) is used with two diameters of a solid wire electrode intended for SAW of non-alloy and fine-grain steels (SFA 5.17). Two different diameters (&Oslash; = 2.4 mm and &Oslash; = 4 mm) were selected to weld two specimens. Both specimens were subjected to the same preparation conditions, heat treatment, macrograph, metallurgical micrograph, and micro-hardness test. Both samples show an almost similar structure with the highest hardness. It is important to note that the thickness of the base metal is 22 mm, and all specifications, preparations and controls were according to ASME Section IX. Welding two similar specimens with the two filler metal diameters demonstrated that the mechanical property (hardness) increases with decreasing diameter. This means that, even though the heat treatment has the same effect under the same conditions, the smaller filler metal diameter ensures a deeper weld penetration and better homogenization. Hence, the SAW technique presented in this study favors industrial use of the smaller filler metal diameter. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ASME" title="ASME">ASME</a>, <a href="https://publications.waset.org/abstracts/search?q=base%20metal" title=" base metal"> base metal</a>, <a href="https://publications.waset.org/abstracts/search?q=micro-hardness%20test" title=" micro-hardness test"> micro-hardness test</a>, <a href="https://publications.waset.org/abstracts/search?q=submerged%20arc%20welding" title=" submerged arc welding"> submerged arc welding</a> </p> <a href="https://publications.waset.org/abstracts/96792/effect-of-filler-metal-diameter-on-weld-joint-of-carbon-steel-sa516-gr-70-and-filler-metal-sfa-517-in-submerged-arc-welding-saw" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/96792.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">153</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">497</span> Influence of Brazing Process Parameters on the Mechanical Properties of Nickel Based Superalloy</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=M.%20Zielinska">M. Zielinska</a>, <a href="https://publications.waset.org/abstracts/search?q=B.%20Daniels"> B. Daniels</a>, <a href="https://publications.waset.org/abstracts/search?q=J.%20Gabel"> J. Gabel</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Paletko"> A. 
Paletko</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The common nickel-based superalloy Inconel 625 was brazed with a Ni-base braze filler material (AMS4777) containing melting-point depressants such as B and Si. Different braze gaps, brazing times and forms of braze filler material were tested. It was determined that the melting-point depressants B and Si tend to form hard and brittle phases in the joint during the braze cycle. Brittle phases significantly reduce the mechanical properties (e.g., tensile strength) of the joint. Therefore, it is important to define optimal process parameters to achieve high-strength joints free of brittle phases. High ultimate tensile strength (UTS) values can be obtained if the joint area is free of brittle phases, which is equivalent to a complete isothermal solidification of the joint. Isothermal solidification takes place only if the concentration of the melting-point depressant in the braze filler material of the joint is continuously reduced by diffusion into the base material. For a given brazing temperature, long brazing times and small braze filler material volumes (small braze gaps) are beneficial for isothermal solidification. On the basis of the obtained results, it can be stated that the form of the braze filler material has an additional influence on the joint quality: better properties can be achieved by using braze filler material in the form of foil instead of paste, due to the reduced amount of voids and the more homogeneous filler composition in the braze gap that foil provides. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=diffusion%20brazing" title="diffusion brazing">diffusion brazing</a>, <a href="https://publications.waset.org/abstracts/search?q=microstructure" title=" microstructure"> microstructure</a>, <a href="https://publications.waset.org/abstracts/search?q=superalloy" title=" superalloy"> superalloy</a>, <a href="https://publications.waset.org/abstracts/search?q=tensile%20strength" title=" tensile strength"> tensile strength</a> </p> <a href="https://publications.waset.org/abstracts/6452/influence-of-brazing-process-parameters-on-the-mechanical-properties-of-nickel-based-superalloy" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/6452.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">363</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">496</span> Highly Realistic Facial Expressions of Anthropomorphic Social Agent as a Factor in Solving the &#039;Uncanny Valley&#039; Problem</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Daniia%20Nigmatullina">Daniia Nigmatullina</a>, <a href="https://publications.waset.org/abstracts/search?q=Vlada%20Kugurakova"> Vlada Kugurakova</a>, <a href="https://publications.waset.org/abstracts/search?q=Maxim%20Talanov"> Maxim Talanov</a> </p> <p class="card-text"><strong>Abstract:</strong></p> We present a methodology and our plans for anthropomorphic social agent visualization. This includes the creation of a three-dimensional model of the virtual companion's head and its facial expressions. Talking Head is a cross-disciplinary project developing a human-machine interface with cognitive functions. 
During the creation of a realistic humanoid robot or character, the ‘uncanny valley’ problem may arise. We consider this phenomenon and its possible causes, and aim to overcome the ‘uncanny valley’ by increasing realism. This article discusses issues that should be considered when creating highly realistic characters (particularly the head), their facial expressions and speech visualization. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=anthropomorphic%20social%20agent" title="anthropomorphic social agent">anthropomorphic social agent</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20animation" title=" facial animation"> facial animation</a>, <a href="https://publications.waset.org/abstracts/search?q=uncanny%20valley" title=" uncanny valley"> uncanny valley</a>, <a href="https://publications.waset.org/abstracts/search?q=visualization" title=" visualization"> visualization</a>, <a href="https://publications.waset.org/abstracts/search?q=3D%20modeling" title=" 3D modeling"> 3D modeling</a> </p> <a href="https://publications.waset.org/abstracts/41558/highly-realistic-facial-expressions-of-anthropomorphic-social-agent-as-a-factor-in-solving-the-uncanny-valley-problem" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/41558.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">290</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">495</span> Anthropometric Measurements of Facial Proportions in Azerbaijan Population</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Nigar%20Sultanova">Nigar Sultanova</a> </p> <p 
class="card-text"><strong>Abstract:</strong></p> Facial morphology is a constant topic of concern for clinicians. When anthropometric methods were introduced into clinical practice to quantify changes in the craniofacial framework, features distinguishing various ethnic groups were discovered. Normative data of facial measurements are indispensable for the precise determination of the degree of deviation from normal. The aim of this study was to establish the reference range of facial proportions in the Azerbaijan population by anthropometric measurements of the craniofacial complex. The study group consisted of 350 healthy young subjects, 175 males and 175 females, 18 to 25 years of age, from 7 different regions of Azerbaijan. The anthropometric examination was performed according to L. Farkas's method with our modification. In order to determine the morphologic characteristics of the seven regions of the craniofacial complex, 42 anthropometric measurements were selected. The anthropometric examination included the use of 33 anthropometric landmarks. The 80 indices of the facial proportions, suggested by Farkas and Munro, were calculated: head - 10, face - 23, nose - 23, lips - 9, orbits - 11, ears - 4. The database of the North American white (NAW) population was used as a reference group. Anthropometric measurements of facial proportions in the Azerbaijan population revealed a significant difference between men and women, consistent with sexual dimorphism. In comparison with North American whites, considerable differences of facial proportions were observed in the head, face, orbits, labio-oral, nose and ear regions. However, in women of the Azerbaijani population, 29 out of 80 proportion indices were similar to the proportions of NAW women. In men of the Azerbaijani population, 27 out of 80 proportion indices did not reveal a statistically significant difference from the proportions of NAW men. 
Estimation of the reference range of facial proportions in the Azerbaijan population might help formulate a successful surgical plan in the treatment of congenital or post-traumatic facial deformities. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20morphology" title="facial morphology">facial morphology</a>, <a href="https://publications.waset.org/abstracts/search?q=anthropometry" title=" anthropometry"> anthropometry</a>, <a href="https://publications.waset.org/abstracts/search?q=indices%20of%20proportion" title=" indices of proportion"> indices of proportion</a>, <a href="https://publications.waset.org/abstracts/search?q=measurement" title=" measurement"> measurement</a> </p> <a href="https://publications.waset.org/abstracts/147472/anthropometric-measurements-of-facial-proportions-in-azerbaijan-population" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/147472.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">117</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">&lsaquo;</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=5">5</a></li> <li class="page-item"><a class="page-link" 
href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=17">17</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=18">18</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=facial%20filler&amp;page=2" rel="next">&rsaquo;</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a 
href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 
2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>
