<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: skin color</title> <meta name="description" content="Search results for: skin color"> <meta name="keywords" content="skin color"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" alt="Open Science 
Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="skin color" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 
mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="skin color"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 2046</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: skin color</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2046</span> Towards Integrating Statistical Color Features for Human Skin Detection</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mohd%20Zamri%20Osman">Mohd Zamri Osman</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohd%20Aizaini%20Maarof"> Mohd Aizaini Maarof</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohd%20Foad%20Rohani"> Mohd Foad Rohani</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Human skin detection is recognized as the primary step in most applications, such as face detection, illicit image filtering, hand recognition, and video surveillance. The performance of any skin detection application relies greatly on two components: feature extraction and the classification method.
Skin color is the most important cue for skin detection. However, a color feature alone sometimes cannot handle images containing regions whose color distribution overlaps that of skin: a pixel-based color feature cannot eliminate skin-like colors, because the intensities of skin and skin-like colors fall under the same distribution. Hence, statistical color analysis, such as the mean and standard deviation, is exploited as an additional feature to increase the reliability of the skin detector. In this paper, we study the effectiveness of statistical color features for human skin detection. Furthermore, the paper analyzes the integration of color and texture using eight classifiers over three color spaces: RGB, YCbCr, and HSV. The experimental results show that integrating the statistical features with a Random Forest classifier achieves a significant performance, with an F1-score of 0.969. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=color%20space" title="color space">color space</a>, <a href="https://publications.waset.org/abstracts/search?q=neural%20network" title=" neural network"> neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=random%20forest" title=" random forest"> random forest</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20detection" title=" skin detection"> skin detection</a>, <a href="https://publications.waset.org/abstracts/search?q=statistical%20feature" title=" statistical feature"> statistical feature</a> </p> <a href="https://publications.waset.org/abstracts/43485/towards-integrating-statistical-color-features-for-human-skin-detection" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/43485.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">462</span> </span> </div> </div> <div
class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2045</span> Use of Segmentation and Color Adjustment for Skin Tone Classification in Dermatological Images</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Fernando%20Duarte">Fernando Duarte</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This work evaluates classical image processing methodologies for skin tone classification in dermatological images. Skin tone is an important attribute when weighing several factors in skin cancer diagnosis, yet there is currently a lack of clear methodologies for classifying skin tone from the dermatological image alone. In this work, a recently released dataset labeled with skin tone was used as the reference for evaluating classical methodologies for segmentation and color space adjustment in skin tone classification. Although the classical methodologies performed well for segmentation and color adjustment, classifying skin tone without proper control over the acquisition of the sample images proved very unreliable.
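As a concrete illustration of CIELAB-based skin tone grading, the Individual Typology Angle (ITA) is a widely used measure derived from the L* and b* coordinates, often mapped to broad tone categories that loosely parallel the Fitzpatrick scale. The sketch below uses thresholds commonly quoted in the literature; it is an illustration only, not the method of the abstract above.

```python
import math

def ita_degrees(L_star, b_star):
    """Individual Typology Angle: arctan((L* - 50) / b*) in degrees."""
    return math.degrees(math.atan2(L_star - 50.0, b_star))

def ita_category(ita):
    """Map an ITA value to a broad skin tone group (common thresholds)."""
    if ita > 55:
        return "very light"
    if ita > 41:
        return "light"
    if ita > 28:
        return "intermediate"
    if ita > 10:
        return "tan"
    if ita > -30:
        return "brown"
    return "dark"
```

For example, a patch with L* = 70 and b* = 15 gives an ITA of about 53 degrees, falling in the "light" group. Note that, as the abstract stresses, such a mapping is only meaningful when image acquisition (illumination, white balance) is controlled.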
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=segmentation" title="segmentation">segmentation</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a>, <a href="https://publications.waset.org/abstracts/search?q=color%20space" title=" color space"> color space</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20tone" title=" skin tone"> skin tone</a>, <a href="https://publications.waset.org/abstracts/search?q=Fitzpatrick" title=" Fitzpatrick"> Fitzpatrick</a> </p> <a href="https://publications.waset.org/abstracts/188975/use-of-segmentation-and-color-adjustment-for-skin-tone-classification-in-dermatological-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/188975.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">35</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2044</span> Clinical Factors of Quality Switched Ruby Laser Therapy for Lentigo Depigmentation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=SunWoo%20Lee">SunWoo Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=TaeBum%20Lee"> TaeBum Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=YoonHwa%20Park"> YoonHwa Park</a>, <a href="https://publications.waset.org/abstracts/search?q=YooJeong%20Kim"> YooJeong Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Solar lentigines appear predominantly on chronically sun-exposed areas of skin, such as the face and the back of the hands. 
Among the several ways to treat lentigines, quality-switched lasers are a well-known, effective option for removing solar lentigines. The present pilot study was therefore designed to assess the efficacy of quality-switched ruby laser treatment of such lentigines by comparing pretreatment and posttreatment skin brightness. Twenty-two adults with chronic sun-damaged skin (mean age 52.8 years, range 37–74 years) were treated at the Korean site. A 694 nm Q-switched ruby laser was used, with the energy density set from 1.4 to 12.5 J/cm2, to treat solar lentigines. The average brightness of skin color was 137.3 before ruby laser treatment and increased to 150.5 afterward, while the standard deviation of skin color decreased from 17.8 to 16.4. In the multivariate model, age and energy were identified as significant factors for the change in skin color brightness after lentigo depigmentation by ruby laser treatment, with respective odds ratios of 1.082 (95% CI, 1.007–1.163) and 1.431 (95% CI, 1.051–1.946). Lentigo depigmentation treatment using ruby lasers thus performed well in brightening skin color, and among the factors involved in ruby laser treatment, age and energy had the greatest influence on how much brighter the skin became relative to pretreatment.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=depigmentation" title="depigmentation">depigmentation</a>, <a href="https://publications.waset.org/abstracts/search?q=lentigine" title=" lentigine"> lentigine</a>, <a href="https://publications.waset.org/abstracts/search?q=quality%20switched%20ruby%20laser" title=" quality switched ruby laser"> quality switched ruby laser</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20color" title=" skin color"> skin color</a> </p> <a href="https://publications.waset.org/abstracts/48368/clinical-factors-of-quality-switched-ruby-laser-therapy-for-lentigo-depigmentation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/48368.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">251</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2043</span> FISCEAPP: FIsh Skin Color Evaluation APPlication</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=J.%20Urban">J. Urban</a>, <a href="https://publications.waset.org/abstracts/search?q=%C3%81.%20S.%20Botella"> Á. S. Botella</a>, <a href="https://publications.waset.org/abstracts/search?q=L.%20E.%20Robaina"> L. E. Robaina</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20B%C3%A1rta"> A. Bárta</a>, <a href="https://publications.waset.org/abstracts/search?q=P.%20Sou%C4%8Dek"> P. Souček</a>, <a href="https://publications.waset.org/abstracts/search?q=P.%20C%C3%ADsa%C5%99"> P. Císař</a>, <a href="https://publications.waset.org/abstracts/search?q=%C5%A0.%20Pap%C3%A1%C4%8Dek"> Š. 
Papáček</a>, <a href="https://publications.waset.org/abstracts/search?q=L.%20M.%20Dom%C3%ADnguez"> L. M. Domínguez</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Skin coloration in fish is of great physiological, behavioral, and ecological importance and can be considered an index of animal welfare in aquaculture, as well as an important quality factor in retail value. Currently, the usual ways to compare color between animals fed on different diets are biochemical analysis and colorimetry of mildly anesthetized or dead fish; these measurements are accurate and meaningful, but invasive. A noninvasive method using digital images of the fish body was therefore developed as a standalone application. The application deals with the computational burden and memory consumption of large input files by optimizing piecewise processing and analysis against the memory/computation time ratio. For comparing color distributions across experiments and across color spaces (RGB, CIE L*a*b*), a comparable semi-equidistant binning of the multi-channel representation is introduced, derived from the known quantization levels and the Freedman-Diaconis rule. Color calibration and the camera responsivity function were a necessary part of the measurement process.
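The Freedman-Diaconis rule mentioned above chooses a histogram bin width from the interquartile range and sample size (width = 2 · IQR · n^(-1/3)). A minimal sketch of deriving a shared, comparable binning for several color channels might look as follows; the helper names and the "use the widest channel's width" policy are illustrative assumptions, not FISCEAPP's actual code.

```python
import statistics

def fd_bin_width(samples):
    """Freedman-Diaconis bin width: 2 * IQR * n^(-1/3)."""
    q1, _, q3 = statistics.quantiles(samples, n=4)  # quartiles
    return 2.0 * (q3 - q1) * len(samples) ** (-1.0 / 3.0)

def shared_edges(channels, lo=0.0, hi=255.0):
    """Semi-equidistant edges shared by all channels so their
    histograms stay directly comparable: take the widest FD width."""
    width = max(fd_bin_width(c) for c in channels)
    nbins = max(1, round((hi - lo) / width))
    return [lo + i * (hi - lo) / nbins for i in range(nbins + 1)]
```

Binning every channel of every experiment with the same edge list is what makes distributions from different diets (or different color spaces, after range normalization) comparable bin by bin.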
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=color%20distribution" title="color distribution">color distribution</a>, <a href="https://publications.waset.org/abstracts/search?q=fish%20skin%20color" title=" fish skin color"> fish skin color</a>, <a href="https://publications.waset.org/abstracts/search?q=piecewise%20transformation" title=" piecewise transformation"> piecewise transformation</a>, <a href="https://publications.waset.org/abstracts/search?q=object%20to%20background%20segmentation" title=" object to background segmentation"> object to background segmentation</a> </p> <a href="https://publications.waset.org/abstracts/15406/fisceapp-fish-skin-color-evaluation-application" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/15406.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">262</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2042</span> Burnout Recognition for Call Center Agents by Using Skin Color Detection with Hand Poses </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=El%20Sayed%20A.%20Sharara">El Sayed A. Sharara</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Tsuji"> A. Tsuji</a>, <a href="https://publications.waset.org/abstracts/search?q=K.%20Terada"> K. Terada</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Call centers have been expanding and they have influence on activation in various markets increasingly. A call center’s work is known as one of the most demanding and stressful jobs. 
In this paper, we propose a fatigue detection system to detect burnout of call center agents, in the form of neck pain and upper back pain. The proposed system is based on a computer vision technique combining skin color detection with the Viola-Jones object detector. To recognize hand poses caused by signs of stress, the YCbCr color space is used to detect the skin color region, including the face and the hand poses around the areas related to neck ache and upper back pain. A cascade of classifiers by Viola-Jones is used for face recognition, extracting the face from the skin color region. Neck pain and upper back pain are then detected by evaluating the hand poses found through skin color detection together with the face recognition result. The system performance is evaluated on two groups of datasets created in the laboratory to simulate a call center environment. Our call center agent burnout detection system was implemented with a web camera and processed in MATLAB. In the experimental results, our system achieved 96.3% for upper back pain detection and 94.2% for neck pain detection.
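A minimal sketch of the YCbCr skin color test used by systems like the one above might look as follows. The BT.601 RGB-to-YCbCr conversion is standard; the Cb/Cr thresholds shown are widely quoted defaults in the skin detection literature, not necessarily the exact values used by the authors.

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range ITU-R BT.601 RGB -> YCbCr conversion (0-255 inputs)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b, cb_range=(77, 127), cr_range=(133, 173)):
    """Classify one pixel using common Cb/Cr skin thresholds.
    Luma (Y) is ignored, which gives some robustness to lighting."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return cb_range[0] <= cb <= cb_range[1] and cr_range[0] <= cr <= cr_range[1]
```

Applying `is_skin` per pixel yields a binary mask; a face detector (e.g., a Viola-Jones cascade) can then separate the face from hand regions within that mask.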
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=call%20center%20agents" title="call center agents">call center agents</a>, <a href="https://publications.waset.org/abstracts/search?q=fatigue" title=" fatigue"> fatigue</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20color%20detection" title=" skin color detection"> skin color detection</a>, <a href="https://publications.waset.org/abstracts/search?q=face%20recognition" title=" face recognition"> face recognition</a> </p> <a href="https://publications.waset.org/abstracts/74913/burnout-recognition-for-call-center-agents-by-using-skin-color-detection-with-hand-poses" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/74913.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">294</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2041</span> Rearrangement and Depletion of Human Skin Folate after UVA Exposure </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Luai%20Z.%20Hasoun">Luai Z. Hasoun</a>, <a href="https://publications.waset.org/abstracts/search?q=Steven%20W.%20Bailey"> Steven W. Bailey</a>, <a href="https://publications.waset.org/abstracts/search?q=Kitti%20K.%20Outlaw"> Kitti K. Outlaw</a>, <a href="https://publications.waset.org/abstracts/search?q=June%20E.%20Ayling"> June E. Ayling</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Human skin color is thought to have evolved to balance sufficient photochemical synthesis of vitamin D versus the need to protect not only DNA but also folate from degradation by ultraviolet light (UV). 
Although the risk of DNA damage and subsequent skin cancer is related to light skin color, the effect of UV on the skin folate of any species is unknown. Here we show that UVA irradiation at 13 mW/cm2, for a total exposure of 187 J/cm2 (similar to a maximal daily equatorial dose), induced a significant loss of total folate in the epidermis of ex vivo white skin. No loss was observed in black skin samples, or in the dermis of either color. Interestingly, while the concentration of 5-methyltetrahydrofolate (5-MTHF) fell in white epidermis, a concomitant increase of tetrahydrofolic acid was found, though not enough to maintain the total pool. These results demonstrate that UVA not only decreases folate in skin but also rearranges the pool components. This could be due in part to the reported increase of NADPH oxidase activity upon UV irradiation, which in turn depletes the NADPH needed for 5-MTHF biosynthesis by 5,10-methylenetetrahydrofolate reductase. The increased tetrahydrofolic acid might further support production of the nucleotide bases needed for DNA repair. However, total folate was lost at a rate that could, with strong or continuous enough exposure to ultraviolet radiation, substantially deplete light-colored skin locally, and also put pressure on total body stores in individuals with low folate intake.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=depletion" title="depletion">depletion</a>, <a href="https://publications.waset.org/abstracts/search?q=folate" title=" folate"> folate</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20skin" title=" human skin"> human skin</a>, <a href="https://publications.waset.org/abstracts/search?q=ultraviolet" title=" ultraviolet"> ultraviolet</a> </p> <a href="https://publications.waset.org/abstracts/40764/rearrangement-and-depletion-of-human-skin-folate-after-uva-exposure" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/40764.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">387</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2040</span> Melaninic Discrimination among Primary School Children</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Margherita%20Cardellini">Margherita Cardellini</a> </p> <p class="card-text"><strong>Abstract:</strong></p> To our knowledge, dark skinned children are often victims of discrimination from adults and society, but few studies specifically focus on skin color discrimination on children coming from the same children. Even today, the 'color blind children' ideology is widespread among adults, teachers, and educators and maybe also among scholars, which seem really careful about study expressions of racism in childhood. This social and cultural belief let people think that all the children, because of their age and their brief experience in the world, are disinterested in skin color. 
Sometimes adults think that children are incapable of even perceiving skin colors, and that it could be dangerous to talk about melaninic differences with them because they might then notice the difference, producing prejudices and racism. Yet psychology and neurology research has shown for many years that infants can already perceive skin color and ethnic differences by the age of 3 months. Starting from this theoretical framework, we conducted a research project to understand if and how primary school children talk about skin colors, capturing any stereotypes or prejudices. Using the focus group as a methodology, chosen to stimulate the group dimension and interaction, several stories emerged about episodes of skin color discrimination within the children's classrooms or schools. Using the photo elicitation technique, we stimulated talk about the research object, skin color, by asking the children for the first two things that came to mind when they looked at the photographs presented during the focus group, which depicted dark- and light-skinned women and men. This paper will present some of these stories of discrimination, ordered by the escalating proximity of the discriminatory act: a story of discrimination within the school, one in an after-school daycare, one in the classroom, and even an episode recounted during the focus groups in the presence of the discriminated child. If it is true that the Declaration of the Rights of the Child states that every child should be free from discrimination, it is also true that every adult should protect children from every form of discrimination. How, as adults, can we defend children against discrimination if we cannot admit that even children are potential actors of discrimination?
Without awareness, we risk devaluing these episodes, implicitly confident that the only way to fight discrimination is to keep it quiet. The right not to be discriminated against includes the right to talk about one's own experiences of discrimination, and the right to perceive the unfairness of the constant depreciation of skin color or any element of physical diversity. Intercultural education can act as the spokesperson for this mission, in the belief that difference and plurality can truly become elements of potential enrichment for humanity, starting from children. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=colorism" title="colorism">colorism</a>, <a href="https://publications.waset.org/abstracts/search?q=experiences%20of%20discrimination" title=" experiences of discrimination"> experiences of discrimination</a>, <a href="https://publications.waset.org/abstracts/search?q=primary%20school%20children" title=" primary school children"> primary school children</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20color%20discrimination" title=" skin color discrimination"> skin color discrimination</a> </p> <a href="https://publications.waset.org/abstracts/77410/melaninic-discrimination-among-primary-school-children" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/77410.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">195</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2039</span> Quality Rabbit Skin Gelatin with Acetic Acid Extract</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Wehandaka%20Pancapalaga">Wehandaka Pancapalaga</a>
</p> <p class="card-text"><strong>Abstract:</strong></p> This study analyzed the water content, yield, fat content, protein content, viscosity, gel strength, pH, melting point, and organoleptic properties of rabbit skin gelatin extracted at different acetic acid levels. The material used in this study was the skin of male Rex rabbits. The treatments were P1 = extraction with 2% (v/v) acetic acid; P2 = extraction with 3% (v/v) acetic acid; P3 = extraction with 4% (v/v) acetic acid; P5 = extraction with 5% (v/v) acetic acid. The results showed that the greater the acetic acid concentration used for extraction, the lower the water content and fat content of the rabbit skin gelatin, and the higher its protein content, viscosity, pH, gel strength, yield, and melting point. The texture, color, and smell of the rabbit gelatin showed no differences from cow skin gelatin. The results showed that the quality of the rabbit skin gelatin conformed to the Indonesian National Standard (SNI). In conclusion, 5% acetic acid extraction produces the best-quality gelatin.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=gelatin" title="gelatin">gelatin</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20rabbit" title=" skin rabbit"> skin rabbit</a>, <a href="https://publications.waset.org/abstracts/search?q=acetic%20acid%20extraction" title=" acetic acid extraction"> acetic acid extraction</a>, <a href="https://publications.waset.org/abstracts/search?q=quality" title=" quality"> quality</a> </p> <a href="https://publications.waset.org/abstracts/61347/quality-rabbit-skin-gelatin-with-acetic-acid-extract" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/61347.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">417</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2038</span> A Survey of Skin Cancer Detection and Classification from Skin Lesion Images Using Deep Learning</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Joseph%20George">Joseph George</a>, <a href="https://publications.waset.org/abstracts/search?q=Anne%20Kotteswara%20Roa"> Anne Kotteswara Roa</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Skin disease is one of the most common health issues faced by people today. Skin cancer (SC) is one of them; its detection relies on skin biopsy results and the expertise of doctors, which consumes considerable time and can yield inaccurate results. Detecting skin cancer at an early stage is a challenging task, yet the disease easily spreads to the whole body and increases the mortality rate, while it is curable when detected early.
The critical tasks for correct and accurate diagnosis are skin cancer identification and classification, based on disease features such as shape, size, color, and symmetry. Many skin diseases share similar characteristics, which makes selecting important features from skin cancer image datasets a challenging issue. Diagnostic accuracy can therefore be improved by an automated skin cancer detection and classification framework, which also addresses the scarcity of human experts. Recently, deep learning techniques such as the convolutional neural network (CNN), deep belief network (DBN), artificial neural network (ANN), recurrent neural network (RNN), and long short-term memory (LSTM) have been widely used for the identification and classification of skin cancers. This survey reviews different DL techniques for skin cancer identification and classification. Performance metrics such as precision, recall, accuracy, sensitivity, specificity, and F-measure are used to evaluate the effectiveness of SC identification using DL techniques. With these DL techniques, classification accuracy increases while computational complexity and time consumption are mitigated.
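The metrics named above all derive from the counts of a binary confusion matrix; a small helper (hypothetical, for illustration only) makes the definitions concrete.

```python
def classification_metrics(tp, fp, tn, fn):
    """Standard evaluation metrics from binary confusion-matrix counts:
    tp/fp/tn/fn = true/false positives and negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                    # a.k.a. sensitivity
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return {"precision": precision, "recall": recall,
            "specificity": specificity, "accuracy": accuracy, "f1": f1}
```

For example, a classifier with 80 true positives, 20 false positives, 90 true negatives, and 10 false negatives has precision 0.80, recall about 0.89, and accuracy 0.85.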
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=skin%20cancer" title="skin cancer">skin cancer</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=performance%20measures" title=" performance measures"> performance measures</a>, <a href="https://publications.waset.org/abstracts/search?q=accuracy" title=" accuracy"> accuracy</a>, <a href="https://publications.waset.org/abstracts/search?q=datasets" title=" datasets"> datasets</a> </p> <a href="https://publications.waset.org/abstracts/151256/a-survey-of-skin-cancer-detection-and-classification-from-skin-lesion-images-using-deep-learning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/151256.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">129</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2037</span> Improved Skin Detection Using Colour Space and Texture</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Medjram%20Sofiane">Medjram Sofiane</a>, <a href="https://publications.waset.org/abstracts/search?q=Babahenini%20Mohamed%20Chaouki"> Babahenini Mohamed Chaouki</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohamed%20Benali%20Yamina"> Mohamed Benali Yamina</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Skin detection is an important task for computer vision systems; the quality of the skin detection method largely determines the overall success of such a system. 
Colour is a good descriptor for detecting skin in images, but lighting effects and objects whose colour resembles skin make skin detection difficult. In this paper, we propose a method that uses the YCbCr colour space for skin detection and the elimination of lighting effects, and then uses texture information to eliminate the false regions detected by the YCbCr skin colour model. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=skin%20detection" title="skin detection">skin detection</a>, <a href="https://publications.waset.org/abstracts/search?q=YCbCr" title=" YCbCr"> YCbCr</a>, <a href="https://publications.waset.org/abstracts/search?q=GLCM" title=" GLCM"> GLCM</a>, <a href="https://publications.waset.org/abstracts/search?q=texture" title=" texture"> texture</a>, <a href="https://publications.waset.org/abstracts/search?q=human%20skin" title=" human skin"> human skin</a> </p> <a href="https://publications.waset.org/abstracts/19039/improved-skin-detection-using-colour-space-and-texture" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/19039.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">459</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2036</span> Analysis of Tactile Perception of Textiles by Fingertip Skin Model</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Izabela%20L.%20Ciesielska-Wr%CF%8Cbel">Izabela L. 
Ciesielska-Wróbel</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents finite element models of the fingertip skin created to simulate the contact of textile objects with the skin, in order to gain a better understanding of the perception of textiles through the skin, the so-called Hand of Textiles (HoT). Many objective and subjective techniques have been developed to analyze HoT; however, none of them provides complete overall information about the sensation of textiles through the skin. As human skin is a complex, heterogeneous, hyperelastic body composed of many particles, some simplifications had to be made when building the models. The same applies to the models of woven structures; however, their utilitarian value was maintained. The models reflect only the friction between the skin and woven textiles, the deformation of the skin and fabrics when “touching” textiles, and the heat transfer from the surface of the skin toward the textiles. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=fingertip%20skin%20models" title="fingertip skin models">fingertip skin models</a>, <a href="https://publications.waset.org/abstracts/search?q=finite%20element%20models" title=" finite element models"> finite element models</a>, <a href="https://publications.waset.org/abstracts/search?q=modelling%20of%20textiles" title=" modelling of textiles"> modelling of textiles</a>, <a href="https://publications.waset.org/abstracts/search?q=sensation%20of%20textiles%20through%20the%20skin" title=" sensation of textiles through the skin"> sensation of textiles through the skin</a> </p> <a href="https://publications.waset.org/abstracts/26064/analysis-of-tactile-perception-of-textiles-by-fingertip-skin-model" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/26064.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 
float-right rounded"> Downloads <span class="badge badge-light">465</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2035</span> Evaluating the Performance of Color Constancy Algorithm</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Damanjit%20Kaur">Damanjit Kaur</a>, <a href="https://publications.waset.org/abstracts/search?q=Avani%20Bhatia"> Avani Bhatia</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Color constancy is significant for human vision since color is a pictorial cue that helps in solving different vision tasks such as tracking, object recognition, or categorization. Therefore, several computational methods have tried to simulate human color constancy abilities in order to stabilize machine color representations. Two different kinds of methods have been used, i.e., normalization and constancy. While color normalization creates a new representation of the image by canceling illuminant effects, color constancy directly estimates the color of the illuminant in order to map the image colors to a canonical version. Color constancy is the capability to determine the colors of objects independent of the color of the light source. This research work studies most of the well-known color constancy algorithms, such as white patch and gray world. 
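The gray-world algorithm mentioned in this abstract can be sketched in a few lines of NumPy. This is an illustrative implementation, not the one evaluated in the study: it assumes the average reflectance of a scene is achromatic, so each channel is rescaled until the channel means agree.

```python
import numpy as np

def gray_world(image):
    """Gray-world color constancy: scale each RGB channel so its mean
    matches the global mean, canceling a uniform illuminant color cast."""
    img = image.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)   # mean R, G, B
    gains = channel_means.mean() / channel_means      # per-channel correction
    corrected = img * gains
    return np.clip(corrected, 0, 255).astype(np.uint8)

# A flat image under a reddish illuminant: after correction the
# three channel means should be (approximately) equal.
reddish = np.full((4, 4, 3), (200, 100, 100), dtype=np.uint8)
balanced = gray_world(reddish)
```

The white-patch algorithm differs only in the statistic used: it rescales by each channel's maximum instead of its mean, assuming the brightest pixel is a white reflector.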
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=color%20constancy" title="color constancy">color constancy</a>, <a href="https://publications.waset.org/abstracts/search?q=gray%20world" title=" gray world"> gray world</a>, <a href="https://publications.waset.org/abstracts/search?q=white%20patch" title=" white patch"> white patch</a>, <a href="https://publications.waset.org/abstracts/search?q=modified%20white%20patch" title=" modified white patch "> modified white patch </a> </p> <a href="https://publications.waset.org/abstracts/4799/evaluating-the-performance-of-color-constancy-algorithm" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/4799.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">319</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2034</span> Hyper-Immunoglobulin E (Hyper-Ige) Syndrome In Skin Of Color: A Retrospective Single-Centre Observational Study</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Rohit%20Kothari">Rohit Kothari</a>, <a href="https://publications.waset.org/abstracts/search?q=Muneer%20Mohamed"> Muneer Mohamed</a>, <a href="https://publications.waset.org/abstracts/search?q=Vivekanandh%20K."> Vivekanandh K.</a>, <a href="https://publications.waset.org/abstracts/search?q=Sunmeet%20Sandhu"> Sunmeet Sandhu</a>, <a href="https://publications.waset.org/abstracts/search?q=Preema%20Sinha"> Preema Sinha</a>, <a href="https://publications.waset.org/abstracts/search?q=Anuj%20Bhatnagar"> Anuj Bhatnagar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Introduction: Hyper-IgE syndrome is a rare primary immunodeficiency syndrome 
characterised by a triad of severe atopic dermatitis, recurrent pulmonary infections, and recurrent staphylococcal skin infections. The diagnosis requires a high degree of suspicion and typical clinical features, not a mere rise in serum IgE levels, which may be seen in multiple conditions. Genetic studies are not always possible in a resource-poor setting. This study highlights various presentations of Hyper-IgE syndrome in children with skin of color. Case series: Our study had six children with Hyper-IgE syndrome aged two months to ten years. All had onset in the first ten months of life except one with a late onset at two years. All had a recurrent eczematoid rash that responded poorly to conventional treatment, secondary infection, multiple episodes of hospitalisation for pulmonary infection, and raised serum IgE levels. One case had occasional vesicles, bullae, and crusted plaques over both extremities. Genetic study was possible in only one of them, who was found to have pathogenic homozygous deletions of exons 15 to 18 in the DOCK8 gene; he underwent bone marrow transplant (BMT) but succumbed to a lower respiratory tract infection two months after BMT. The rest received multiple courses of antibiotics, oral/topical steroids, and cyclosporine intermittently, with variable response. Discussion: Our study highlights the characteristics, presentation, and management of this rare syndrome in children. Knowledge of these manifestations in skin of color will facilitate early identification and contribute to optimal patient care, as representative data on the subject are limited in the literature. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=absolute%20eosinophil%20count" title="absolute eosinophil count">absolute eosinophil count</a>, <a href="https://publications.waset.org/abstracts/search?q=atopic%20dermatitis" title=" atopic dermatitis"> atopic dermatitis</a>, <a href="https://publications.waset.org/abstracts/search?q=eczematous%20rash" title=" eczematous rash"> eczematous rash</a>, <a href="https://publications.waset.org/abstracts/search?q=hyper-immunoglobulin%20E%20syndrome" title=" hyper-immunoglobulin E syndrome"> hyper-immunoglobulin E syndrome</a>, <a href="https://publications.waset.org/abstracts/search?q=pulmonary%20infection" title=" pulmonary infection"> pulmonary infection</a>, <a href="https://publications.waset.org/abstracts/search?q=serum%20IgE" title=" serum IgE"> serum IgE</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20of%20color" title=" skin of color"> skin of color</a> </p> <a href="https://publications.waset.org/abstracts/143963/hyper-immunoglobulin-e-hyper-ige-syndrome-in-skin-of-color-a-retrospective-single-centre-observational-study" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/143963.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">138</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2033</span> In vitro Antioxidant and DNA Protectant Activity of Different Skin Colored Eggplant (Solanum melongena)</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=K.%20M.%20Somawathie">K. M. Somawathie</a>, <a href="https://publications.waset.org/abstracts/search?q=V.%20Rizliya"> V. 
Rizliya</a>, <a href="https://publications.waset.org/abstracts/search?q=H.%20A.%20M.%20Wickrmasinghe"> H. A. M. Wickrmasinghe</a>, <a href="https://publications.waset.org/abstracts/search?q=Terrence%20Madhujith"> Terrence Madhujith</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The main objective of our study was to determine the in vitro antioxidant and DNA-protectant activity of aqueous extracts of S. melongena with different skin colors: dark purple (DP), moderately purple (MP), light purple (LP), and purple and green (PG). The antioxidant activity was evaluated using the DPPH and ABTS free radical scavenging assays, ferric reducing antioxidant power (FRAP), ferric thiocyanate (FTC), and the egg yolk model. The effectiveness of the eggplant extracts against radical-induced DNA damage was also determined. There was a significant difference (p < 0.0001) in antioxidant activity among skin colors. TPC and FRAP values of the eggplant extracts ranged from 48.67±0.27 to 61.11±0.26 (mg GAE/100 g fresh weight) and 4.19±0.11 to 7.46±0.26 (mmol of FeSO4/g of fresh weight), respectively. MP displayed the highest percentage of DPPH radical scavenging activity, while DP demonstrated the strongest total antioxidant capacity. In the FTC and egg yolk models, DP and MP showed better antioxidant activity than PG and LP. All eggplant extracts showed potent antioxidant activity in protecting DNA against AAPH-mediated radical damage. DP and MP demonstrated better antioxidant activity, which may be attributed to their higher phenolic content, since a positive correlation was observed between TPC and the antioxidant parameters. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Solanum%20melongena" title="Solanum melongena">Solanum melongena</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20color" title=" skin color"> skin color</a>, <a href="https://publications.waset.org/abstracts/search?q=antioxidant" title=" antioxidant"> antioxidant</a>, <a href="https://publications.waset.org/abstracts/search?q=DNA%20protection" title=" DNA protection"> DNA protection</a>, <a href="https://publications.waset.org/abstracts/search?q=lipid%20peroxidation" title=" lipid peroxidation"> lipid peroxidation</a> </p> <a href="https://publications.waset.org/abstracts/35657/in-vitro-antioxidant-and-dna-protectant-activity-of-different-skin-colored-eggplant-solanum-melongena" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/35657.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">431</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2032</span> A Way of Converting Color Images to Gray Scale Ones for the Color-Blind: Applying to the part of the Tokyo Subway Map</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Katsuhiro%20Narikiyo">Katsuhiro Narikiyo</a>, <a href="https://publications.waset.org/abstracts/search?q=Shota%20Hashikawa"> Shota Hashikawa</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper proposes a way of removing noise and reducing the number of colors contained in a JPEG image. The main purpose of this project is to convert color images to monochrome images for the color-blind. We work with crisp, information-dense color images such as the Tokyo subway map. 
Each color in the image carries important information, but color-blind viewers cannot distinguish similar colors. If those colors are converted to distinct gray values, they become distinguishable. Therefore, we convert color images to monochrome images. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=color-blind" title="color-blind">color-blind</a>, <a href="https://publications.waset.org/abstracts/search?q=JPEG" title=" JPEG"> JPEG</a>, <a href="https://publications.waset.org/abstracts/search?q=monochrome%20image" title=" monochrome image"> monochrome image</a>, <a href="https://publications.waset.org/abstracts/search?q=denoise" title=" denoise"> denoise</a> </p> <a href="https://publications.waset.org/abstracts/2968/a-way-of-converting-color-images-to-gray-scale-ones-for-the-color-blind-applying-to-the-part-of-the-tokyo-subway-map" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/2968.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">356</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2031</span> Penetration Depth Study of Linear Siloxanes through Human Skin</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=K.%20Szymkowska">K. Szymkowska</a>, <a href="https://publications.waset.org/abstracts/search?q=K.%20Mojsiewicz-%20Pie%C5%84kowska"> K. Mojsiewicz- Pieńkowska</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Siloxanes are common ingredients in medicinal products used on the skin, as well as in cosmetics. It is widely believed that silicones are not capable of overcoming the skin barrier. 
The aim of the study was to verify whether linear siloxanes can penetrate and permeate human skin and to determine the depth to which these compounds penetrate. Based on the results, it was found that human skin is not a barrier to linear siloxanes. The fact that PDMS 50 cSt was not identified in the dermis suggests that silicones of this molecular size (3780 Da) are safe when used in skin formulations. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=linear%20siloxanes" title="linear siloxanes">linear siloxanes</a>, <a href="https://publications.waset.org/abstracts/search?q=methyl%20siloxanes" title=" methyl siloxanes"> methyl siloxanes</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20penetration" title=" skin penetration"> skin penetration</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20permeation" title=" skin permeation"> skin permeation</a> </p> <a href="https://publications.waset.org/abstracts/47996/penetration-depth-study-of-linear-siloxanes-through-human-skin" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/47996.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">401</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2030</span> Automatic Facial Skin Segmentation Using Possibilistic C-Means Algorithm for Evaluation of Facial Surgeries</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Elham%20Alaee">Elham Alaee</a>, <a href="https://publications.waset.org/abstracts/search?q=Mousa%20Shamsi"> Mousa Shamsi</a>, <a href="https://publications.waset.org/abstracts/search?q=Hossein%20Ahmadi"> Hossein Ahmadi</a>, <a 
href="https://publications.waset.org/abstracts/search?q=Soroosh%20Nazem"> Soroosh Nazem</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammad%20Hossein%20Sedaaghi"> Mohammad Hossein Sedaaghi </a> </p> <p class="card-text"><strong>Abstract:</strong></p> The human face plays a fundamental role in an individual’s appearance, so the importance of facial surgeries is undeniable. Appropriate and accurate facial skin segmentation is therefore needed in order to extract different features. Since the Fuzzy C-Means (FCM) clustering algorithm does not cope well with noisy images and outliers, in this paper we exploit the Possibilistic C-Means (PCM) algorithm to segment the facial skin. For this purpose, facial images are first converted from the RGB to the YCbCr color space. To evaluate the performance of the proposed algorithm, the database of Sahand University of Technology, Tabriz, Iran was used. For a better assessment of the proposed algorithm, the FCM and Expectation-Maximization (EM) algorithms were also used for facial skin segmentation. The proposed method shows better results than the other segmentation methods, with a misclassification error of 0.032 and a region-area error of 0.045. 
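The first step described above, converting facial images from RGB to YCbCr, can be sketched as follows. This uses the common ITU-R BT.601 full-range formula as an assumption (the abstract does not specify the exact conversion), and the PCM clustering step itself is omitted:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Convert an (H, W, 3) uint8 RGB image to YCbCr (ITU-R BT.601, full range).
    Chrominance (Cb, Cr) is largely independent of brightness, which is why
    skin segmentation is commonly performed in this space."""
    m = np.array([[ 0.299,     0.587,     0.114],
                  [-0.168736, -0.331264,  0.5],
                  [ 0.5,      -0.418688, -0.081312]])
    ycbcr = rgb.astype(np.float64) @ m.T
    ycbcr[..., 1:] += 128.0          # center Cb and Cr around 128
    return ycbcr

# Pure gray pixels carry no chrominance: Y equals the gray level, Cb = Cr = 128.
gray = np.full((2, 2, 3), 120, dtype=np.uint8)
out = rgb_to_ycbcr(gray)
```

Clustering in the (Cb, Cr) plane rather than in RGB reduces the influence of uneven lighting across the face, since the illumination variation is concentrated in the Y channel.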
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=facial%20image" title="facial image">facial image</a>, <a href="https://publications.waset.org/abstracts/search?q=segmentation" title=" segmentation"> segmentation</a>, <a href="https://publications.waset.org/abstracts/search?q=PCM" title=" PCM"> PCM</a>, <a href="https://publications.waset.org/abstracts/search?q=FCM" title=" FCM"> FCM</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20error" title=" skin error"> skin error</a>, <a href="https://publications.waset.org/abstracts/search?q=facial%20surgery" title=" facial surgery"> facial surgery</a> </p> <a href="https://publications.waset.org/abstracts/10297/automatic-facial-skin-segmentation-using-possibilistic-c-means-algorithm-for-evaluation-of-facial-surgeries" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/10297.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">586</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2029</span> Hand Detection and Recognition for Malay Sign Language</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mohd%20Noah%20A.%20Rahman">Mohd Noah A. Rahman</a>, <a href="https://publications.waset.org/abstracts/search?q=Afzaal%20H.%20Seyal"> Afzaal H. Seyal</a>, <a href="https://publications.waset.org/abstracts/search?q=Norhafilah%20Bara"> Norhafilah Bara</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Interest keeps growing in developing software applications that interface with computers and peripheral devices through human body gestures such as hand movements. 
Hand gesture detection and recognition based on computer vision techniques remains a very challenging task, yet it promises a more natural, innovative, and sophisticated form of non-verbal communication, such as sign language, in human-computer interaction. This paper explores hand detection and hand gesture recognition using a vision-based approach. Skin color spaces such as HSV and YCrCb are applied for hand detection and recognition. However, there are limitations to consider: almost all skin color space models are sensitive to quickly changing or mixed lighting conditions, and hand recognition gives better results only under certain restrictions, such as the distance of the user’s hand from the webcam and the posture and size of the hand. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=hand%20detection" title="hand detection">hand detection</a>, <a href="https://publications.waset.org/abstracts/search?q=hand%20gesture" title=" hand gesture"> hand gesture</a>, <a href="https://publications.waset.org/abstracts/search?q=hand%20recognition" title=" hand recognition"> hand recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=sign%20language" title=" sign language"> sign language</a> </p> <a href="https://publications.waset.org/abstracts/46765/hand-detection-and-recognition-for-malay-sign-language" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/46765.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">306</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2028</span> Skin Care through Ayurveda</h5> <div class="card-body"> <p 
class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=K.%20L.%20Virupaksha%20Gupta">K. L. Virupaksha Gupta </a> </p> <p class="card-text"><strong>Abstract:</strong></p> Ayurveda offers a holistic outlook on skin care. The first step in Ayurveda is to identify the skin type, and care is then tailored accordingly, making it highly personalized. Dermatology offers various skin type classifications, such as the Baumann skin types (based on four parameters: i) oily vs. dry, ii) sensitive vs. resistant, iii) pigmented vs. non-pigmented, iv) wrinkled vs. tight (unwrinkled)), but skin typing in Ayurveda is determined mainly by the prakriti (constitution) of the individual as well as the status of the Doshas (humors), which are of three types: Vata, Pitta, and Kapha. The difference between them is attributed mainly to the qualities of each dosha (humor), and all of the above skin types can be incorporated under these three. Skin care modalities vary greatly with constitution. The skin of an individual of Vata constitution is lustreless, with a rough texture and cracks due to dryness, and should be given warm and unctuous therapies, oil massage for lubrication, and natural moisturizers for hydration. The skin of an individual of Pitta constitution looks more vascular (pinkish), delicate, and sensitive, with a fair complexion, unctuousness, and a tendency toward wrinkles and greying of hair at an early age; it should be given cooling and nurturing therapies, and tanning treatments should be avoided. An individual of Kapha constitution has oily, delicate skin that looks beautiful and radiant, and mainly requires therapies to combat oiliness. Hence, skin typing and skin care in Ayurveda are highly rational and scientific. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ayurveda" title="Ayurveda">Ayurveda</a>, <a href="https://publications.waset.org/abstracts/search?q=dermatology" title=" dermatology"> dermatology</a>, <a href="https://publications.waset.org/abstracts/search?q=Dosha" title=" Dosha"> Dosha</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20types" title=" skin types"> skin types</a> </p> <a href="https://publications.waset.org/abstracts/19790/skin-care-through-ayurveda" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/19790.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">407</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2027</span> Assessment of Image Databases Used for Human Skin Detection Methods</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Saleh%20Alshehri">Saleh Alshehri</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Human skin detection is a vital step in many applications, some of them critical, especially those related to security. This underscores the importance of a high-performance detection algorithm. Image databases are usually used to validate the accuracy of an algorithm; however, the suitability of these databases is still questionable. It is suggested that suitability can be measured mainly by how much of the color space the database spans. This research investigates the validity of three well-known image databases. 
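One way to make the proposed suitability measure concrete is to quantize the color space into coarse bins and report the fraction of bins that a database's pixels occupy. This is a hypothetical sketch, not the measure actually used in the paper:

```python
import numpy as np

def color_space_coverage(pixels, bins_per_channel=8):
    """Fraction of quantized RGB bins occupied by the given (N, 3) uint8 pixels.
    Higher coverage suggests the database spans more of the color space."""
    step = 256 // bins_per_channel
    quantized = pixels.astype(np.int64) // step       # each channel -> 0..bins-1
    flat = (quantized[:, 0] * bins_per_channel + quantized[:, 1]) \
           * bins_per_channel + quantized[:, 2]       # unique bin index per pixel
    occupied = np.unique(flat).size
    return occupied / bins_per_channel ** 3

# A single repeated color occupies exactly one of the 8*8*8 = 512 bins.
pixels = np.tile(np.array([[200, 150, 130]], dtype=np.uint8), (1000, 1))
cov = color_space_coverage(pixels)
```

A database whose skin pixels cover only a narrow region of the color space would overstate the accuracy of detectors tuned to that region.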
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=image%20databases" title="image databases">image databases</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20processing" title=" image processing"> image processing</a>, <a href="https://publications.waset.org/abstracts/search?q=pattern%20recognition" title=" pattern recognition"> pattern recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=neural%20networks" title=" neural networks"> neural networks</a> </p> <a href="https://publications.waset.org/abstracts/87836/assessment-of-image-databases-used-for-human-skin-detection-methods" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/87836.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">271</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2026</span> Black-Brown and Yellow-Brown-Red Skin Pigmentation Elements are Shared in Common: Using Art and Science for Multicultural Education</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mary%20Kay%20Bacallao">Mary Kay Bacallao</a> </p> <p class="card-text"><strong>Abstract:</strong></p> New research on the human genome has revealed secrets to the variation in skin pigmentation found in all human populations. Application of this research to multicultural education has a profound effect on students from all backgrounds. This paper identifies the four locations in the human genome that code for variation in skin pigmentation worldwide. 
The research makes this new knowledge accessible to students of all ages as they participate in an art project that brings these scientific multicultural concepts to life. Students apply breakthrough scientific principles through hands-on art activities in which they simulate the work of DNA coding to create their own skin tone from the colors expressed, to varying degrees, in every people group. As students create their own handprint artwork from the palette of colors, they realize that each color on the palette is essential to creating every tone of skin. The project thus serves to bring people together and foster appreciation of the variety and diversity in skin tones. As students explore the variations, they create pigmentation using the eumelanins, which are the black-brown sources of pigmentation, and the pheomelanins, which are the yellow-reddish-brown sources of pigmentation. The project dispels myths about skin tones that have divided people in the past, and as a group activity it leads to greater appreciation and understanding of diverse family groups. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=diversity" title="diversity">diversity</a>, <a href="https://publications.waset.org/abstracts/search?q=multicultural" title=" multicultural"> multicultural</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20pigmentation" title=" skin pigmentation"> skin pigmentation</a>, <a href="https://publications.waset.org/abstracts/search?q=eumelanins" title=" eumelanins"> eumelanins</a>, <a href="https://publications.waset.org/abstracts/search?q=pheomelanins" title=" pheomelanins"> pheomelanins</a>, <a href="https://publications.waset.org/abstracts/search?q=handprint" title=" handprint"> handprint</a>, <a href="https://publications.waset.org/abstracts/search?q=artwork" title=" artwork"> artwork</a>, <a href="https://publications.waset.org/abstracts/search?q=science" title=" science"> science</a>, <a href="https://publications.waset.org/abstracts/search?q=genome" title=" genome"> genome</a>, <a href="https://publications.waset.org/abstracts/search?q=human" title=" human"> human</a> </p> <a href="https://publications.waset.org/abstracts/171441/black-brown-and-yellow-brown-red-skin-pigmentation-elements-are-shared-in-common-using-art-and-science-for-multicultural-education" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/171441.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">67</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2025</span> Spectra Analysis in Sunset Color Demonstrations with a White-Color LED as a Light Source</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Makoto%20Hasegawa">Makoto 
Hasegawa</a>, <a href="https://publications.waset.org/abstracts/search?q=Seika%20Tokumitsu"> Seika Tokumitsu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Spectra of light beams emitted from white-color LED torches differ from those of conventional electric torches. To confirm whether white-color LED torches can nevertheless serve as light sources for popular sunset color demonstrations, the spectra of transmitted and scattered light beams were measured and compared in a 50 cm-long water tank, using both a white-color LED torch (composed of a blue LED and a yellow fluorescent material) and a conventional electric torch as light sources. A suspension was prepared in the tank from acrylic emulsion and tap water, and the beams from each torch were allowed to travel through it. A sunset-like color was indeed observed when the white-color LED torch was used as the light source. To the naked eye, however, the observed colors look slightly different from those obtained with the conventional electric torch, and careful observation revealed color changes in the short-to-middle wavelength regions with the white-color LED. These results confirm that white-color LED torches are applicable as light sources in sunset color demonstrations, provided these differences are kept in mind, and that more advanced classes can be successfully conducted with them. 
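A plausible physical reading of the demonstration (the abstract reports only measured spectra, so the scattering model here is an assumption) is Rayleigh-type scattering in the suspension, whose intensity scales as 1/&lambda;&#8308; and therefore removes blue light from the travelling beam faster than red:

```python
# Rayleigh scattering intensity scales as 1/wavelength**4 (assumed model;
# the abstract itself only reports measured spectra).
def rayleigh_ratio(lam_short_nm, lam_long_nm):
    """How much more strongly the shorter wavelength is scattered."""
    return (lam_long_nm / lam_short_nm) ** 4

# Blue (~450 nm) vs. red (~650 nm): blue is scattered roughly 4.35x more,
# so the transmitted beam reddens, producing the sunset-like color.
print(round(rayleigh_ratio(450, 650), 2))  # 4.35
```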
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=blue%20sky%20demonstration" title="blue sky demonstration">blue sky demonstration</a>, <a href="https://publications.waset.org/abstracts/search?q=sunset%20color%20demonstration" title=" sunset color demonstration"> sunset color demonstration</a>, <a href="https://publications.waset.org/abstracts/search?q=white%20LED%20torch" title=" white LED torch"> white LED torch</a>, <a href="https://publications.waset.org/abstracts/search?q=physics%20education" title=" physics education"> physics education</a> </p> <a href="https://publications.waset.org/abstracts/47625/spectra-analysis-in-sunset-color-demonstrations-with-a-white-color-led-as-a-light-source" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/47625.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">284</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2024</span> Fabrication of Optical Tissue Phantoms Simulating Human Skin and Their Application</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jihoon%20Park">Jihoon Park</a>, <a href="https://publications.waset.org/abstracts/search?q=Sungkon%20Yu"> Sungkon Yu</a>, <a href="https://publications.waset.org/abstracts/search?q=Byungjo%20Jung"> Byungjo Jung</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Although various optical tissue phantoms (OTPs) simulating human skin have been actively studied, their fidelity remains unclear because skin tissue has intricate optical properties and a complicated structure that hinder optical simulation. 
In this study, we designed a multilayer OTP mimicking skin structure and fabricated OTP models simulating skin with blood vessels and skin with pigmentation, both useful in the biomedical optics field. The OTPs were characterized in terms of optical properties and cross-sectional structure, and analyzed with various optical tools, such as a laser speckle imaging system, OCT, and a digital microscope, to demonstrate their practicality. The measured optical properties were within 5% error, and the thickness of each layer was uniform to within 10% error at the micrometer scale. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=blood%20vessel" title="blood vessel">blood vessel</a>, <a href="https://publications.waset.org/abstracts/search?q=optical%20tissue%20phantom" title=" optical tissue phantom"> optical tissue phantom</a>, <a href="https://publications.waset.org/abstracts/search?q=optical%20property" title=" optical property"> optical property</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20tissue" title=" skin tissue"> skin tissue</a>, <a href="https://publications.waset.org/abstracts/search?q=pigmentation" title=" pigmentation"> pigmentation</a> </p> <a href="https://publications.waset.org/abstracts/68389/fabrication-of-optical-tissue-phantoms-simulating-human-skin-and-their-application" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/68389.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">455</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2023</span> Classification of Red, Green and Blue Values from Face Images Using k-NN Classifier to Predict the Skin or Non-Skin</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> 
<a href="https://publications.waset.org/abstracts/search?q=Kemal%20Polat">Kemal Polat</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this study, we estimate whether a given sample is skin by using RGB values obtained from the camera and a k-nearest neighbor (k-NN) classifier. The dataset used in this study has an unbalanced distribution and a linearly non-separable structure; the problem can also be considered a big data problem. The Skin dataset was taken from the UCI machine learning repository. As the classifier, we used the k-NN method with k = 1 to handle this big data problem. To train and test the k-NN classifier, a 50-50% training-testing partition was used. As performance metrics, TP rate, FP rate, precision, recall, F-measure, and AUC were used to evaluate the k-NN classifier, yielding 0.999, 0.001, 0.999, 0.999, 0.999, and 1.00, respectively. As can be seen from these results, the proposed method can be used to predict whether an image region is skin or not. 
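The 1-NN rule described above can be sketched in a few lines. The training triples and the function name here are illustrative only; the paper itself uses the UCI Skin Segmentation dataset, not this toy data:

```python
# Minimal k-NN (k = 1) sketch for skin / non-skin prediction from RGB values.
# The training triples below are made-up illustrations, not the UCI data.
def nearest_neighbor_predict(train, labels, rgb):
    """Return the label of the training RGB triple closest to `rgb`."""
    def dist2(a, b):  # squared Euclidean distance in RGB space
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(train)), key=lambda i: dist2(train[i], rgb))
    return labels[best]

train = [(224, 172, 140), (198, 134, 102), (30, 30, 30), (10, 200, 10)]
labels = ["skin", "skin", "non-skin", "non-skin"]
print(nearest_neighbor_predict(train, labels, (210, 150, 120)))  # prints "skin"
```

With k = 1 there is no vote to take; the label of the single closest training point is returned directly, which is why the paper's choice of k = 1 keeps the method simple on a large dataset.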
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=k-NN%20classifier" title="k-NN classifier">k-NN classifier</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20or%20non-skin%20classification" title=" skin or non-skin classification"> skin or non-skin classification</a>, <a href="https://publications.waset.org/abstracts/search?q=RGB%20values" title=" RGB values"> RGB values</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a> </p> <a href="https://publications.waset.org/abstracts/86538/classification-of-red-green-and-blue-values-from-face-images-using-k-nn-classifier-to-predict-the-skin-or-non-skin" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/86538.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">248</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2022</span> A Neural Approach for Color-Textured Images Segmentation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Khalid%20Salhi">Khalid Salhi</a>, <a href="https://publications.waset.org/abstracts/search?q=El%20Miloud%20Jaara"> El Miloud Jaara</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammed%20Talibi%20Alaoui"> Mohammed Talibi Alaoui</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we present a neural approach for unsupervised natural color-texture image segmentation based on both Kohonen maps and mathematical morphology, using a combination of the texture and color information of the image: fractal features based on the fractal dimension are 
selected to represent the texture information, together with color features represented in the RGB color space. These features are then used to train the Kohonen network, whose resulting map approximates the underlying probability density function; the segmentation of this map is performed by the morphological watershed transformation. The performance of our color-texture segmentation approach is compared first to color-only and texture-only methods, and then to the k-means method. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=segmentation" title="segmentation">segmentation</a>, <a href="https://publications.waset.org/abstracts/search?q=color-texture" title=" color-texture"> color-texture</a>, <a href="https://publications.waset.org/abstracts/search?q=neural%20networks" title=" neural networks"> neural networks</a>, <a href="https://publications.waset.org/abstracts/search?q=fractal" title=" fractal"> fractal</a>, <a href="https://publications.waset.org/abstracts/search?q=watershed" title=" watershed"> watershed</a> </p> <a href="https://publications.waset.org/abstracts/51740/a-neural-approach-for-color-textured-images-segmentation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/51740.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">346</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2021</span> The Effect of Skin to Skin Contact Immediately to Maternal Breastfeeding Self-Efficacy after Cesarean Section</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=D.%20Triana">D. Triana</a>, <a href="https://publications.waset.org/abstracts/search?q=I.%20N.%20Rachmawati"> I. N. 
Rachmawati</a>, <a href="https://publications.waset.org/abstracts/search?q=L.%20Sabri"> L. Sabri</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Maternal breastfeeding self-efficacy is positively associated with increased duration of breastfeeding across cultures and age groups. This study aims to determine the effect of skin-to-skin contact immediately after cesarean section on maternal breastfeeding self-efficacy. The research design is a posttest-only quasi-experimental design with a control group, involving 52 women recruited by consecutive sampling in Langsa, Aceh. Data were collected through the Breastfeeding Self-Efficacy Scale-Short Form. The results of an independent t-test showed a significant difference in mean maternal breastfeeding self-efficacy between the intervention group and the control group (59.00 ± 6.54 vs. 49.62 ± 7.78; p = 0.001). Skin-to-skin contact thus significantly affects maternal breastfeeding self-efficacy after cesarean section. 
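For readers who want to check the reported statistic from the summary data: assuming the 52 women were split evenly into two arms of 26 (the abstract does not state the per-group sizes), a pooled two-sample t statistic comes out around 4.7 at 50 degrees of freedom, comfortably significant:

```python
import math

# Pooled two-sample t statistic from summary statistics. The group sizes
# (26 per arm) are an assumption; the abstract reports only the total n = 52.
def two_sample_t(m1, s1, n1, m2, s2, n2):
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)  # pooled variance
    se = math.sqrt(sp2 * (1 / n1 + 1 / n2))                      # standard error
    return (m1 - m2) / se

print(round(two_sample_t(59.00, 6.54, 26, 49.62, 7.78, 26), 2))  # ~4.71
```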
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=breastfeeding%20self-efficacy" title="breastfeeding self-efficacy">breastfeeding self-efficacy</a>, <a href="https://publications.waset.org/abstracts/search?q=cesarean%20section" title=" cesarean section"> cesarean section</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20to%20skin%20contact" title=" skin to skin contact"> skin to skin contact</a>, <a href="https://publications.waset.org/abstracts/search?q=immediately" title=" immediately"> immediately</a> </p> <a href="https://publications.waset.org/abstracts/32533/the-effect-of-skin-to-skin-contact-immediately-to-maternal-breastfeeding-self-efficacy-after-cesarean-section" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/32533.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">377</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2020</span> A Convolutional Deep Neural Network Approach for Skin Cancer Detection Using Skin Lesion Images</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Firas%20Gerges">Firas Gerges</a>, <a href="https://publications.waset.org/abstracts/search?q=Frank%20Y.%20Shih"> Frank Y. Shih</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Malignant melanoma, known simply as melanoma, is a type of skin cancer that appears as a mole on the skin. It is critical to detect this cancer at an early stage because it can spread across the body and may lead to the patient's death. When detected early, melanoma is curable. 
In this paper, we propose a deep learning model (convolutional neural networks) in order to automatically classify skin lesion images as malignant or benign. Images underwent certain pre-processing steps to diminish the effect of the normal skin region on the model. The result of the proposed model showed a significant improvement over previous work, achieving an accuracy of 97%. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title="deep learning">deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=skin%20cancer" title=" skin cancer"> skin cancer</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20processing" title=" image processing"> image processing</a>, <a href="https://publications.waset.org/abstracts/search?q=melanoma" title=" melanoma"> melanoma</a> </p> <a href="https://publications.waset.org/abstracts/134720/a-convolutional-deep-neural-network-approach-for-skin-cancer-detection-using-skin-lesion-images" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/134720.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">148</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2019</span> Experimental Characterization of the Color Quality and Error Rate for an Red, Green, and Blue-Based Light Emission Diode-Fixture Used in Visible Light Communications</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Juan%20F.%20Gutierrez">Juan F. Gutierrez</a>, <a href="https://publications.waset.org/abstracts/search?q=Jesus%20M.%20Quintero"> Jesus M. 
Quintero</a>, <a href="https://publications.waset.org/abstracts/search?q=Diego%20Sandoval"> Diego Sandoval</a> </p> <p class="card-text"><strong>Abstract:</strong></p> An important feature of LED technology is its fast on-off switching, which enables data transmission. Visible Light Communication (VLC) is a wireless method of transmitting data with visible light. Modulation formats such as On-Off Keying (OOK) and Color Shift Keying (CSK) are used in VLC. CSK is based on three color bands and uses red, green, and blue monochromatic LEDs (RGB-LED) to define a constellation of chromaticities; this form of CSK provides poor color quality in the illuminated area. This work presents the design and implementation of a VLC system using RGB-based CSK with 16, 8, and 4 color points, mixed with a steady baseline from a phosphor white LED, to improve the color quality of the LED fixture. The experimental system was assessed in terms of the Color Rendering Index (CRI) and the Symbol Error Rate (SER): good color quality was obtained with an acceptable SER. The laboratory setup used to characterize and calibrate the LED fixture is also described. 
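The SER figure of merit used above is simply the fraction of received symbols that were decoded incorrectly. A toy illustration (the symbol streams below are invented, not the paper's measurements):

```python
# Symbol Error Rate: fraction of received symbols that differ from those sent.
def symbol_error_rate(sent, received):
    errors = sum(s != r for s, r in zip(sent, received))
    return errors / len(sent)

sent     = [3, 1, 0, 2, 3, 1, 2, 0, 1, 3]  # transmitted CSK symbol indices
received = [3, 1, 0, 2, 1, 1, 2, 0, 1, 3]  # one symbol decoded incorrectly
print(symbol_error_rate(sent, received))   # 0.1
```

In a CSK link each symbol indexes one chromaticity point in the 4-, 8-, or 16-point constellation, so SER directly measures how often the receiver picks the wrong color point.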
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=VLC" title="VLC">VLC</a>, <a href="https://publications.waset.org/abstracts/search?q=indoor%20lighting" title=" indoor lighting"> indoor lighting</a>, <a href="https://publications.waset.org/abstracts/search?q=color%20quality" title=" color quality"> color quality</a>, <a href="https://publications.waset.org/abstracts/search?q=symbol%20error%20rate" title=" symbol error rate"> symbol error rate</a>, <a href="https://publications.waset.org/abstracts/search?q=color%20shift%20keying" title=" color shift keying"> color shift keying</a> </p> <a href="https://publications.waset.org/abstracts/158336/experimental-characterization-of-the-color-quality-and-error-rate-for-an-red-green-and-blue-based-light-emission-diode-fixture-used-in-visible-light-communications" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/158336.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">100</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2018</span> The Impact of the “Cold Ambient Color = Healthy” Intuition on Consumer Food Choice</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yining%20Yu">Yining Yu</a>, <a href="https://publications.waset.org/abstracts/search?q=Bingjie%20Li"> Bingjie Li</a>, <a href="https://publications.waset.org/abstracts/search?q=Miaolei%20Jia"> Miaolei Jia</a>, <a href="https://publications.waset.org/abstracts/search?q=Lei%20Wang"> Lei Wang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Ambient color temperature is one of the most ubiquitous factors in retailing. 
However, there is limited research regarding the effect of cold versus warm ambient color on consumers’ food consumption. This research investigates an unexplored lay belief named the “cold ambient color = healthy” intuition and its impact on food choice. We demonstrate that consumers have built the “cold ambient color = healthy” intuition, such that they infer that a restaurant with a cold-colored ambiance is more likely to sell healthy food than a warm-colored restaurant. This deep-seated intuition also guides consumers’ food choices. We find that using a cold (vs. warm) ambient color increases the choice of healthy food, which offers insights into healthy diet promotion for retailers and policymakers. Theoretically, our work contributes to the literature on color psychology, sensory marketing, and food consumption. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ambient%20color%20temperature" title="ambient color temperature">ambient color temperature</a>, <a href="https://publications.waset.org/abstracts/search?q=cold%20ambient%20color" title=" cold ambient color"> cold ambient color</a>, <a href="https://publications.waset.org/abstracts/search?q=food%20choice" title=" food choice"> food choice</a>, <a href="https://publications.waset.org/abstracts/search?q=consumer%20wellbeing" title=" consumer wellbeing"> consumer wellbeing</a> </p> <a href="https://publications.waset.org/abstracts/148864/the-impact-of-the-cold-ambient-color-healthy-intuition-on-consumer-food-choice" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/148864.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">142</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">2017</span> 
Transparent Photovoltaic Skin for Artificial Thermoreceptor and Nociceptor Memory</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Priyanka%20Bhatnagar">Priyanka Bhatnagar</a>, <a href="https://publications.waset.org/abstracts/search?q=Malkeshkumar%20Patel"> Malkeshkumar Patel</a>, <a href="https://publications.waset.org/abstracts/search?q=Joondong%20Kim"> Joondong Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Joonpyo%20Hong"> Joonpyo Hong</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Artificial skin and sensory memory platforms are produced using a flexible, transparent photovoltaic (TPV) device. The TPV device is composed of a metal oxide heterojunction (n-ZnO/p-NiO) and transmits visible light (> 50%) while producing substantial electric power (0.5 V and 200 μA cm⁻²). This TPV device is a transparent energy interface that can be used to detect signals and propagate information without an external energy supply. The TPV artificial skin offers a temperature detection range (0 °C to 75 °C) that is wider than that of natural skin (5 °C to 48 °C) due to the temperature-sensitive pyrocurrent from the ZnO layer. Moreover, the TPV thermoreceptor offers sensory memory of extreme thermal stimuli. Much like natural skin, the artificial skin uses the nociceptor mechanism to protect tissue from harmful damage via signal amplification (hyperalgesia) and early adaptation (allodynia). This demonstrates the many features of TPV artificial skin, which can sense and transmit signals and memorize information in self-operated mode. This transparent photovoltaic skin can provide sustainable energy for use in human electronics. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=transparent" title="transparent">transparent</a>, <a href="https://publications.waset.org/abstracts/search?q=photovoltaics" title=" photovoltaics"> photovoltaics</a>, <a href="https://publications.waset.org/abstracts/search?q=thermal%20memory" title=" thermal memory"> thermal memory</a>, <a href="https://publications.waset.org/abstracts/search?q=artificial%20skin" title=" artificial skin"> artificial skin</a>, <a href="https://publications.waset.org/abstracts/search?q=thermoreceptor" title=" thermoreceptor"> thermoreceptor</a> </p> <a href="https://publications.waset.org/abstracts/149259/transparent-photovoltaic-skin-for-artificial-thermoreceptor-and-nociceptor-memory" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/149259.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">110</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">‹</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=skin%20color&page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=skin%20color&page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=skin%20color&page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=skin%20color&page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=skin%20color&page=6">6</a></li> <li class="page-item"><a class="page-link" 
href="https://publications.waset.org/abstracts/search?q=skin%20color&page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=skin%20color&page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=skin%20color&page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=skin%20color&page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=skin%20color&page=68">68</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=skin%20color&page=69">69</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=skin%20color&page=2" rel="next">›</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a 
href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">© 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div 
class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">×</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>