
Search results for: guided imagery

href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="guided imagery"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 1050</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: guided imagery</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1050</span> Students’ Perception of Guided Imagery Improving Anxiety before Examination: A Qualitative Study</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Wong%20Ka%20Fai">Wong Ka Fai</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Introduction: Many students are worried before an examination; that is a common picture worldwide. Health problems from stress before examination were insomnia, tiredness, isolation, stomach upset, and anxiety. Nursing students experienced high stress from the examination. Guided imagery is a healing process of applying imagination to help the body heal, survive, or live well. It can bring about significant physiological and biochemical changes, which can trigger the recovery process. A study of nursing students improving their anxiety before examination with guided imagery was proposed. Aim: The aim of this study was to explore the outcome of guided imagery on nursing students’ anxiety before examination in Hong Kong. Method: The qualitative study method was used. 16 first-year students studying nursing programme were invited to practice guided imagery to improve their anxiety before the examination period. One week before the examination, the semi-structured interviews with these students were carried out by the researcher. Result: From the content analysis of interview data, these nursing students showed considerable similarities in their anxiety perception. Nursing students’ perceived improved anxiety was evidenced by a reduction of stressful feelings, improved physical health, satisfaction with daily activities, and enhanced skills for solving problems and upcoming situations. Conclusion: This study indicated that guided imagery can be used as an alternative measure to improve students’ anxiety and psychological problems. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=nursing%20students" title="nursing students">nursing students</a>, <a href="https://publications.waset.org/abstracts/search?q=perception" title=" perception"> perception</a>, <a href="https://publications.waset.org/abstracts/search?q=anxiety" title=" anxiety"> anxiety</a>, <a href="https://publications.waset.org/abstracts/search?q=guided%20imagery" title=" guided imagery"> guided imagery</a> </p> <a href="https://publications.waset.org/abstracts/172769/students-perception-of-guided-imagery-improving-anxiety-before-examination-a-qualitative-study" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/172769.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">76</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1049</span> The Effect of PETTLEP Imagery on Equestrian Jumping Tasks</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Nurwina%20Anuar">Nurwina Anuar</a>, <a href="https://publications.waset.org/abstracts/search?q=Aswad%20Anuar"> Aswad Anuar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Imagery is a popular mental technique used by athletes and coaches to improve learning and performance. It has been widely investigated and beneficial in the sports context. However, the imagery application in equestrian sport has been understudied. Thus, the effectiveness of imagery should encompass the application in the equestrian sport to ensure its application covert all sports. Unlike most sports (e.g., football, badminton, tennis, ski) which are both mental and physical are dependent solely upon human decision and response, equestrian sports involves the interaction of human-horse collaboration to success in the equestrian tasks. This study aims to investigate the effect of PETTLEP imagery on equestrian jumping tasks, motivation and imagery ability. It was hypothesized that the use of PETTLEP imagery intervention will significantly increase in the skill equestrian jumping tasks. It was also hypothesized that riders’ imagery ability and motivation will increase across phases. The participants were skilled riders with less to no imagery experience. A single-subject ABA design was employed. The study was occurred over five week’s period at Universiti Teknologi Malaysia Equestrian Park. Imagery ability was measured using the Sport Imagery Assessment Questionnaires (SIAQ), the motivational measured based on the Motivational imagery ability measure for Sport (MIAMS). The effectiveness of the PETTLEP imagery intervention on show jumping tasks were evaluated by the professional equine rider on the observational scale. Results demonstrated the improvement on all equestrian jumping tasks for the most participants from baseline to intervention. Result shows the improvement on imagery ability and participants’ motivations after the PETTLEP imagery intervention. Implication of the present study include underlining the impact of PETTLEP imagery on equestrian jumping tasks. The result extends the previous research on the effectiveness of PETTLEP imagery in the sports context that involves interaction and collaboration between human and horse. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=PETTLEP%20imagery" title="PETTLEP imagery">PETTLEP imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=imagery%20ability" title=" imagery ability"> imagery ability</a>, <a href="https://publications.waset.org/abstracts/search?q=equestrian" title=" equestrian"> equestrian</a>, <a href="https://publications.waset.org/abstracts/search?q=equestrian%20jumping%20tasks" title=" equestrian jumping tasks"> equestrian jumping tasks</a> </p> <a href="https://publications.waset.org/abstracts/82648/the-effect-of-pettlep-imagery-on-equestrian-jumping-tasks" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/82648.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">202</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1048</span> Effects of Different Kinds of Combined Action Observation and Motor Imagery on Improving Golf Putting Performance and Learning</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Chi%20H.%20Lin">Chi H. Lin</a>, <a href="https://publications.waset.org/abstracts/search?q=Chi%20C.%20Lin"> Chi C. Lin</a>, <a href="https://publications.waset.org/abstracts/search?q=Chih%20L.%20Hsieh"> Chih L. Hsieh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Motor Imagery (MI) alone or combined with action observation (AO) has been shown to enhance motor performance and skill learning. The most effective way to combine these techniques has received limited scientific scrutiny. In the present study, we examined the effects of simultaneous (i.e., observing an action whilst imagining carrying out the action concurrently), alternate (i.e., observing an action and then doing imagery related to that action consecutively) and synthesis (alternately perform action observation and imagery action and then perform observation and imagery action simultaneously) AOMI combinations on improving golf putting performance and learning. Participants, 45 university students who had no formal experience of using imagery for the study, were randomly allocated to one of four training groups: simultaneous action observation and motor imagery (S-AOMI), alternate action observation and motor imagery (A-AOMI), synthesis action observation and motor imagery (A-S-AOMI), and a control group. And it was applied 'Different Experimental Groups with Pre and Post Measured' designs. Participants underwent eighteen times of different interventions, which were happened three times a week and lasting for six weeks. We analyzed the information we received based on two-factor (group × times) mixed between and within analysis of variance to discuss the real effects on participants' golf putting performance and learning about different intervention methods of different types of combined action observation and motor imagery. After the intervention, we then used imagery questionnaire and journey to understand the condition and suggestion about different motor imagery and action observation intervention from the participants. 
The results revealed that all three experimental groups, but not the control group, improved in putting performance and learning, and that the A-S-AOMI group showed a significantly larger effect than the S-AOMI group. The results confirm the benefit of combining motor imagery with action observation for the performance and learning of golf putting; in particular, the synthesis condition, in which action observation and motor imagery are first performed alternately and then simultaneously, was the most effective.
Keywords: motor skill learning, motor imagery, action observation, simulation
Procedia: https://publications.waset.org/abstracts/128207/effects-of-different-kinds-of-combined-action-observation-and-motor-imagery-on-improving-golf-putting-performance-and-learning | PDF: https://publications.waset.org/abstracts/128207.pdf | Downloads: 138

1047. Comparative Study of Accuracy of Land Cover/Land Use Mapping Using Medium Resolution Satellite Imagery: A Case Study
Authors: M. C. Paliwal, A. K. Jain, S. K. Katiyar
Abstract: Assessment of the accuracy of classified satellite imagery is very important. To determine the accuracy of a classified image, assumed-true data are usually derived from ground truth data collected with the Global Positioning System. The data derived from the satellite imagery and the ground truth data are then compared to determine the accuracy of the classification, and error matrices are prepared; overall and individual accuracies are calculated using different methods. The study illustrates advanced classification and accuracy assessment of land use/land cover mapping using satellite imagery. IRS-1C LISS-IV data were used for the classification. The satellite image was classified into fourteen classes, including water bodies, agricultural fields, forest land, urban settlement, barren land, and unclassified area, among others. Classification and accuracy calculation were carried out in ERDAS Imagine software to find out the best method. The study is based on data collected for the Bhopal city boundaries of Madhya Pradesh State, India.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=resolution" title="resolution">resolution</a>, <a href="https://publications.waset.org/abstracts/search?q=accuracy%20assessment" title=" accuracy assessment"> accuracy assessment</a>, <a href="https://publications.waset.org/abstracts/search?q=land%20use%20mapping" title=" land use mapping"> land use mapping</a>, <a href="https://publications.waset.org/abstracts/search?q=satellite%20imagery" title=" satellite imagery"> satellite imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=ground%20truth%20data" title=" ground truth data"> ground truth data</a>, <a href="https://publications.waset.org/abstracts/search?q=error%20matrices" title=" error matrices"> error matrices</a> </p> <a href="https://publications.waset.org/abstracts/13294/comparative-study-of-accuracy-of-land-coverland-use-mapping-using-medium-resolution-satellite-imagery-a-case-study" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/13294.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">507</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1046</span> Mental Imagery as an Auxiliary Tool to the Performance of Elite Competitive Swimmers of the University of the East Manila</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hillary%20Jo%20Muyalde">Hillary Jo Muyalde</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Introduction: Elite athletes train regularly to enhance their physical endurance, but sometimes, training sessions are not enough. When competition comes, these athletes struggle to find focus. Mental imagery is a psychological technique that helps condition the mind to focus and eventually help improve performance. This study aims to help elite competitive swimmers of the University of the East improve their performance with Mental Imagery as an auxiliary tool. Methodology: The study design used was quasi-experimental with a purposive sampling technique and a within-subject design. It was conducted with a total of 41 participants. The participants were given a Sport Imagery Ability Questionnaire (SIAQ) to measure imagery ability and the Mental Imagery Program. The study utilized a Paired T-test for data analysis where the participants underwent six weeks of no mental imagery training and were compared to six weeks with the Mental Imagery Program (MIP). The researcher recorded the personal best time of participants in their respective specialty stroke. Results: The results of the study showed a t-value of 17.804 for Butterfly stroke events, 9.922 for Backstroke events, 7.787 for Breaststroke events, and 17.440 in Freestyle. This indicated that MIP had a positive effect on participants’ performance. The SIAQ result also showed a big difference where -10.443 for Butterfly events, -5.363 for Backstroke, -7.244 for Breaststroke events, and -10.727 for Freestyle events, which meant the participants were able to image better than before MIP. Conclusion: In conclusion, the findings of this study showed that there is indeed an improvement in the performance of the participants after the application of the Mental Imagery Program. 
1046. Mental Imagery as an Auxiliary Tool to the Performance of Elite Competitive Swimmers of the University of the East Manila
Authors: Hillary Jo Muyalde
Abstract: Introduction: Elite athletes train regularly to enhance their physical endurance, but training sessions are sometimes not enough; when competition comes, these athletes struggle to find focus. Mental imagery is a psychological technique that helps condition the mind to focus and, eventually, helps improve performance. This study aims to help elite competitive swimmers of the University of the East improve their performance with mental imagery as an auxiliary tool. Methodology: The study used a quasi-experimental, within-subject design with purposive sampling and was conducted with 41 participants. Participants were given the Sport Imagery Ability Questionnaire (SIAQ) to measure imagery ability, together with the Mental Imagery Program (MIP). A paired t-test was used for data analysis, comparing six weeks without mental imagery training to six weeks with the MIP. The researcher recorded each participant's personal best time in their specialty stroke. Results: The study found t-values of 17.804 for Butterfly events, 9.922 for Backstroke events, 7.787 for Breaststroke events, and 17.440 for Freestyle, indicating that the MIP had a positive effect on performance. The SIAQ results also showed a large change, with values of -10.443 for Butterfly events, -5.363 for Backstroke, -7.244 for Breaststroke events, and -10.727 for Freestyle events, meaning the participants were able to image better than before the MIP. Conclusion: The findings of this study show an improvement in the performance of the participants after the application of the Mental Imagery Program. It is recommended that the participants continue to use mental imagery as an auxiliary tool in their training regimen for continued positive results.
Keywords: mental imagery, personal best time, SIAQ, specialty stroke
Procedia: https://publications.waset.org/abstracts/178990/mental-imagery-as-an-auxiliary-tool-to-the-performance-of-elite-competitive-swimmers-of-the-university-of-the-east-manila | PDF: https://publications.waset.org/abstracts/178990.pdf | Downloads: 79
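The paired t-test analysis referred to in the abstract above can be reproduced in outline as follows; the swim times are made-up placeholders, and scipy.stats.ttest_rel stands in for whatever statistics package the authors actually used.

    # Paired comparison of personal best times before and after the imagery program.
    import numpy as np
    from scipy import stats

    before_mip = np.array([62.1, 58.4, 71.3, 65.0, 59.8, 66.2])   # seconds, pre-intervention (placeholder)
    after_mip  = np.array([61.2, 57.9, 70.1, 64.1, 59.0, 65.3])   # seconds, after 6 weeks with MIP (placeholder)

    t_stat, p_value = stats.ttest_rel(before_mip, after_mip)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")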
1045. Optimizing Glycemic Control with AI-Guided Dietary Supplements: A Randomized Trial in Type 2 Diabetes
Authors: Evgeny Pokushalov, Claire Garcia, Andrey Ponomarenko, John Smith, Michael Johnson, Inessa Pak, Evgenya Shrainer, Dmitry Kudlay, Leila Kasimova, Richard Miller
Abstract: This study evaluated the efficacy of an AI-guided dietary supplement regimen compared to a standard physician-guided regimen in managing Type 2 diabetes (T2D). A total of 160 patients were randomly assigned to either the AI-guided group (n=80) or the physician-guided group (n=80) and followed over 90 days. The AI-guided group received 5.3 ± 1.2 supplements per patient, while the physician-guided group received 2.7 ± 0.6 supplements per patient. The AI system personalized supplement types and dosages based on individual genetic and metabolic profiles. The AI-guided group showed a significant reduction in HbA1c levels from 7.5 ± 0.8% to 7.1 ± 0.7%, compared to a reduction from 7.6 ± 0.9% to 7.4 ± 0.8% in the physician-guided group (mean difference: -0.3%, 95% CI: -0.5% to -0.1%; p < 0.01). Secondary outcomes, including fasting plasma glucose, HOMA-IR, and insulin levels, also improved more in the AI-guided group. Subgroup analyses revealed that the AI-guided regimen was particularly effective in patients with specific genetic polymorphisms and elevated metabolic markers. Safety profiles were comparable between both groups, with no serious adverse events reported. In conclusion, the AI-guided dietary supplement regimen significantly improved glycemic control and metabolic health in T2D patients compared to the standard physician-guided approach, demonstrating the potential of personalized AI-driven interventions in diabetes management.
Keywords: Type 2 diabetes, AI-guided supplementation, personalized medicine, glycemic control, metabolic health, genetic polymorphisms, dietary supplements, HbA1c, fasting plasma glucose, HOMA-IR, personalized nutrition
Procedia: https://publications.waset.org/abstracts/194485/optimizing-glycemic-control-with-ai-guided-dietary-supplements-a-randomized-trial-in-type-2-diabetes | PDF: https://publications.waset.org/abstracts/194485.pdf | Downloads: 9
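A rough sketch of the kind of between-group comparison reported above (mean HbA1c change with a 95% confidence interval); the arrays are simulated with magnitudes loosely matching the abstract and do not reproduce the trial's data or analysis code.

    # Between-group mean difference in HbA1c change with a 95% CI and Welch's t-test.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    ai_change  = rng.normal(-0.4, 0.3, 80)   # simulated HbA1c change, AI-guided group (n=80)
    std_change = rng.normal(-0.2, 0.3, 80)   # simulated HbA1c change, physician-guided group (n=80)

    diff = ai_change.mean() - std_change.mean()
    se = np.sqrt(ai_change.var(ddof=1) / len(ai_change) + std_change.var(ddof=1) / len(std_change))
    dof = len(ai_change) + len(std_change) - 2            # simple degrees-of-freedom approximation
    ci = stats.t.ppf([0.025, 0.975], dof) * se + diff
    t_stat, p = stats.ttest_ind(ai_change, std_change, equal_var=False)

    print(f"mean difference: {diff:.2f}% (95% CI {ci[0]:.2f} to {ci[1]:.2f}), p = {p:.4f}")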
1044. 3D Guided Image Filtering to Improve Quality of Short-Time Binned Dynamic PET Images Using MRI Images
Authors: Tabassum Husain, Shen Peng Li, Zhaolin Chen
Abstract: This paper evaluates the usability of 3D guided image filtering to enhance the quality of short-time binned dynamic PET images by using MRI images. Guided image filtering is an edge-preserving filter originally proposed to enhance 2D images. Here, the 3D filter is applied to 1- and 5-minute binned images, and the results are compared with 15-minute binned images and with Gaussian filtering. The guided image filter enhances the quality of dynamic PET images while preserving important voxel information.
Keywords: dynamic PET images, guided image filter, image enhancement, information preservation filtering
Procedia: https://publications.waset.org/abstracts/152864/3d-guided-image-filtering-to-improve-quality-of-short-time-binned-dynamic-pet-images-using-mri-images | PDF: https://publications.waset.org/abstracts/152864.pdf | Downloads: 132
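The guided filter referenced above follows the local linear model of He et al.; a minimal 3D version using scipy's uniform_filter is sketched below, with an MRI volume guiding the smoothing of a PET volume. Array shapes and parameter values are illustrative assumptions, not the authors' implementation.

    # Edge-preserving guided filter extended to 3D volumes (sketch).
    import numpy as np
    from scipy.ndimage import uniform_filter

    def guided_filter_3d(guide, src, radius=2, eps=1e-3):
        """Filter `src` (e.g. a noisy PET volume) guided by `guide` (e.g. an MRI volume)."""
        size = 2 * radius + 1
        box = lambda x: uniform_filter(x, size=size)   # local mean over a cubic window

        mean_i  = box(guide)
        mean_p  = box(src)
        corr_ip = box(guide * src)
        corr_ii = box(guide * guide)

        cov_ip = corr_ip - mean_i * mean_p
        var_i  = corr_ii - mean_i * mean_i

        a = cov_ip / (var_i + eps)          # local linear coefficients: q = a * I + b
        b = mean_p - a * mean_i
        return box(a) * guide + box(b)      # average the coefficients, then apply to the guide

    # Example with random placeholder volumes of matching shape:
    mri = np.random.rand(64, 64, 32)
    pet = mri + 0.3 * np.random.randn(64, 64, 32)
    filtered = guided_filter_3d(mri, pet, radius=2, eps=1e-3)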
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=cloud%20detection" title="cloud detection">cloud detection</a>, <a href="https://publications.waset.org/abstracts/search?q=cloud%20remove" title=" cloud remove"> cloud remove</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-temporal%20imagery" title=" multi-temporal imagery"> multi-temporal imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=land%20resources%20investigation" title=" land resources investigation"> land resources investigation</a> </p> <a href="https://publications.waset.org/abstracts/90359/multi-temporal-cloud-detection-and-removal-in-satellite-imagery-for-land-resources-investigation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/90359.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">278</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1042</span> The Image as an Initial Element of the Cognitive Understanding of Words</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=S.%20Pesina">S. Pesina</a>, <a href="https://publications.waset.org/abstracts/search?q=T.%20Solonchak"> T. Solonchak</a> </p> <p class="card-text"><strong>Abstract:</strong></p> An analysis of word semantics focusing on the invariance of advanced imagery in several pressing problems. Interest in the language of imagery is caused by the introduction, in the linguistics sphere, of a new paradigm, the center of which is the personality of the speaker (the subject of the language). Particularly noteworthy is the question of the place of the image when discussing the lexical, phraseological values and the relationship of imagery and metaphors. In part, the formation of a metaphor, as an interaction between two intellective entities, occurs at a cognitive level, and it is the category of the image, having cognitive roots, which aides in the correct interpretation of the results of this process on the lexical-semantic level. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=image" title="image">image</a>, <a href="https://publications.waset.org/abstracts/search?q=metaphor" title=" metaphor"> metaphor</a>, <a href="https://publications.waset.org/abstracts/search?q=concept" title=" concept"> concept</a>, <a href="https://publications.waset.org/abstracts/search?q=creation%20of%20a%20metaphor" title=" creation of a metaphor"> creation of a metaphor</a>, <a href="https://publications.waset.org/abstracts/search?q=cognitive%20linguistics" title=" cognitive linguistics"> cognitive linguistics</a>, <a href="https://publications.waset.org/abstracts/search?q=erased%20image" title=" erased image"> erased image</a>, <a href="https://publications.waset.org/abstracts/search?q=vivid%20image" title=" vivid image"> vivid image</a> </p> <a href="https://publications.waset.org/abstracts/10617/the-image-as-an-initial-element-of-the-cognitive-understanding-of-words" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/10617.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">361</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1041</span> Satellite Imagery Classification Based on Deep Convolution Network</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Zhong%20Ma">Zhong Ma</a>, <a href="https://publications.waset.org/abstracts/search?q=Zhuping%20Wang"> Zhuping Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Congxin%20Liu"> Congxin Liu</a>, <a href="https://publications.waset.org/abstracts/search?q=Xiangzeng%20Liu"> Xiangzeng Liu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Satellite imagery classification is a challenging problem with many practical applications. In this paper, we designed a deep convolution neural network (DCNN) to classify the satellite imagery. The contributions of this paper are twofold &mdash; First, to cope with the large-scale variance in the satellite image, we introduced the inception module, which has multiple filters with different size at the same level, as the building block to build our DCNN model. Second, we proposed a genetic algorithm based method to efficiently search the best hyper-parameters of the DCNN in a large search space. The proposed method is evaluated on the benchmark database. The results of the proposed hyper-parameters search method show it will guide the search towards better regions of the parameter space. Based on the found hyper-parameters, we built our DCNN models, and evaluated its performance on satellite imagery classification, the results show the classification accuracy of proposed models outperform the state of the art method. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=satellite%20imagery%20classification" title="satellite imagery classification">satellite imagery classification</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20convolution%20network" title=" deep convolution network"> deep convolution network</a>, <a href="https://publications.waset.org/abstracts/search?q=genetic%20algorithm" title=" genetic algorithm"> genetic algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=hyper-parameter%20optimization" title=" hyper-parameter optimization"> hyper-parameter optimization</a> </p> <a href="https://publications.waset.org/abstracts/44963/satellite-imagery-classification-based-on-deep-convolution-network" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/44963.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">300</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1040</span> Guided Wave in a Cylinder with Trepezoid Cross-Section</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Nan%20Tang">Nan Tang</a>, <a href="https://publications.waset.org/abstracts/search?q=Bin%20Wu"> Bin Wu</a>, <a href="https://publications.waset.org/abstracts/search?q=Cunfu%20He"> Cunfu He</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The trapezoid rods are widely used in civil engineering as load –carrying members. Ultrasonic guided wave is one of the most popular techniques in analyzing the propagation of elastic guided wave. The goal of this paper is to investigate the propagation of elastic waves in the isotropic bar with trapezoid cross-section. Dispersion curves that describe the relationship between the frequency and velocity provide the fundamental information to describe the propagation of elastic waves through a structure. Based on the SAFE (semi-analytical finite element) a linear algebraic system of equations is obtained. By using numerical methods, dispersion curves solved for the rods with the trapezoid cross-section. These fundamental information plays an important role in applying ultrasonic guided waves to NTD for structures with trapezoid cross section. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=guided%20wave" title="guided wave">guided wave</a>, <a href="https://publications.waset.org/abstracts/search?q=dispersion" title=" dispersion"> dispersion</a>, <a href="https://publications.waset.org/abstracts/search?q=finite%20element%20method" title=" finite element method"> finite element method</a>, <a href="https://publications.waset.org/abstracts/search?q=trapezoid%20rod" title=" trapezoid rod"> trapezoid rod</a> </p> <a href="https://publications.waset.org/abstracts/30839/guided-wave-in-a-cylinder-with-trepezoid-cross-section" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/30839.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">292</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1039</span> Teachers’ and Students’ Reactions to a Guided Reading Program Designed by a Teachers’ Professional Learning Community</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yea-Mei%20Leou">Yea-Mei Leou</a>, <a href="https://publications.waset.org/abstracts/search?q=Shiu-Hsung%20Huang"> Shiu-Hsung Huang</a>, <a href="https://publications.waset.org/abstracts/search?q=T.%20C.%20Shen"> T. C. Shen</a>, <a href="https://publications.waset.org/abstracts/search?q=Chin-Ya%20Fang"> Chin-Ya Fang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The purposes of this study were to explore how to establish a professional learning community for English teachers at a junior high school, and to explore how teachers and students think about the guided reading program. The participants were three experienced English teachers and their ESL seventh-grade students from three classes in a junior high school. Leveled picture books and worksheets were used in the program. Questionnaires and interviews were used for gathering information. The findings were as follows: First, most students enjoyed this guided reading program. Second, the teachers thought the guided reading program was helpful to students’ learning and the discussions in the professional learning community refreshed their ideas, but the preparation for the teaching was time-consuming. Suggestions based on the findings were provided. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ESL%20students" title="ESL students">ESL students</a>, <a href="https://publications.waset.org/abstracts/search?q=guided%20reading" title=" guided reading"> guided reading</a>, <a href="https://publications.waset.org/abstracts/search?q=leveled%20books" title=" leveled books"> leveled books</a>, <a href="https://publications.waset.org/abstracts/search?q=professional%20learning%20community" title=" professional learning community"> professional learning community</a> </p> <a href="https://publications.waset.org/abstracts/6750/teachers-and-students-reactions-to-a-guided-reading-program-designed-by-a-teachers-professional-learning-community" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/6750.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">377</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1038</span> Cognitive Linguistic Features Underlying Spelling Development in a Second Language: A Case Study of L2 Spellers in South Africa</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=A.%20Van%20Staden">A. Van Staden</a>, <a href="https://publications.waset.org/abstracts/search?q=A.%20Tolmie"> A. Tolmie</a>, <a href="https://publications.waset.org/abstracts/search?q=E.%20Vorster"> E. Vorster</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Research confirms the multifaceted nature of spelling development and underscores the importance of both cognitive and linguistic skills that affect sound spelling development such as working and long-term memory, phonological and orthographic awareness, mental orthographic images, semantic knowledge and morphological awareness. This has clear implications for many South African English second language spellers (L2) who attempt to become proficient spellers. Since English has an opaque orthography, with irregular spelling patterns and insufficient sound/grapheme correspondences, L2 spellers can neither rely, nor draw on the phonological awareness skills of their first language (for example Sesotho and many other African languages), to assist them to spell the majority of English words. Epistemologically, this research is informed by social constructivism. In addition the researchers also hypothesized that the principles of the Overlapping Waves Theory was an appropriate lens through which to investigate whether L2 spellers could significantly improve their spelling skills via the implementation of an alternative route to spelling development, namely the orthographic route, and more specifically via the application of visual imagery. Post-test results confirmed the results of previous research that argues for the interactive nature of different cognitive and linguistic systems such as working memory and its subsystems and long-term memory, as learners were systematically guided to store visual orthographic images of words in their long-term lexicons. Moreover, the results have shown that L2 spellers in the experimental group (n = 9) significantly outperformed L2 spellers (n = 9) in the control group whose intervention involved phonological awareness (and coding) including the teaching of spelling rules. 
Consequently, L2 learners in the experimental group improved significantly on all post-test measures included in this investigation, namely the four sub-tests of short-term memory and two spelling measures (a diagnostic and a standardized measure). Against this background, the findings of this study look promising and show that, within a social-constructivist learning environment, learners can be systematically guided to apply higher-order thinking processes such as visual imagery to successfully store and retrieve mental images of spelling words from their output lexicons. Results from the present study could also help direct research into this under-researched aspect of L2 literacy development within the South African education context.
Keywords: English second language spellers, phonological and orthographic coding, social constructivism, visual imagery as spelling strategy
Procedia: https://publications.waset.org/abstracts/46165/cognitive-linguistic-features-underlying-spelling-development-in-a-second-language-a-case-study-of-l2-spellers-in-south-africa | PDF: https://publications.waset.org/abstracts/46165.pdf | Downloads: 359

1037. A Novel Spectral Index for Automatic Shadow Detection in Urban Mapping Based on WorldView-2 Satellite Imagery
Authors: Kaveh Shahi, Helmi Z. M. Shafri, Ebrahim Taherzadeh
Abstract: In remote sensing, shadow causes problems in many applications, such as change detection and classification. Shadow is cast by elevated objects and can directly affect the accuracy of extracted information, so it is very important to detect shadows, particularly in urban high-spatial-resolution imagery, where they pose a significant problem. This paper focuses on automatic shadow detection based on a new spectral index for multispectral imagery, the Shadow Detection Index (SDI). The new spectral index was tested on different areas of WorldView-2 images, and the results demonstrate that it has great potential to extract shadows effectively and automatically.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=spectral%20index" title="spectral index">spectral index</a>, <a href="https://publications.waset.org/abstracts/search?q=shadow%20detection" title=" shadow detection"> shadow detection</a>, <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing%20images" title=" remote sensing images"> remote sensing images</a>, <a href="https://publications.waset.org/abstracts/search?q=World-View%202" title=" World-View 2"> World-View 2</a> </p> <a href="https://publications.waset.org/abstracts/13500/a-novel-spectral-index-for-automatic-shadow-detection-in-urban-mapping-based-on-worldview-2-satellite-imagery" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/13500.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">538</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1036</span> Effectiveness of Imagery Compared with Exercise Training on Hip Abductor Strength and EMG Production in Healthy Adults</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Majid%20Manawer%20Alenezi">Majid Manawer Alenezi</a>, <a href="https://publications.waset.org/abstracts/search?q=Gavin%20Lawrence"> Gavin Lawrence</a>, <a href="https://publications.waset.org/abstracts/search?q=Hans-Peter%20Kubis"> Hans-Peter Kubis</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Imagery training could be an important treatment for muscle function improvements in patients who are facing limitations in exercise training by pain or other adverse symptoms. However, recent studies are mostly limited to small muscle groups and are often contradictory. Moreover, a possible bilateral transfer effect of imagery training has not been examined. We, therefore, investigated the effectiveness of unilateral imagery training in comparison with exercise training on hip abductor muscle strength and EMG. Additionally, both limbs were assessed to investigate bilateral transfer effects. Healthy individuals took part in an imagery or exercise training intervention for two weeks and were assesses pre and post training. Participants (n=30), after randomization into an imagery and an exercise group, trained 5 times a week under supervision with additional self-performed training on the weekends. The training consisted of performing, or to imagine, 5 maximal isometric hip abductor contractions (= one set), repeating the set 7 times. All measurements and trainings were performed laying on the side on a dynamometer table. The imagery script combined kinesthetic and visual imagery with internal perspective for producing imagined maximal hip abduction contractions. The exercise group performed the same number of tasks but performing the maximal hip abductor contractions. Maximal hip abduction strength and EMG amplitudes were measured of right and left limbs pre- and post-training period. Additionally, handgrip strength and right shoulder abduction (Strength and EMG) were measured. Using mixed model ANOVA (strength measures) and Wilcoxen-tests (EMGs), data revealed a significant increase in hip abductor strength production in the imagery group on the trained right limb (~6%). 
1036. Effectiveness of Imagery Compared with Exercise Training on Hip Abductor Strength and EMG Production in Healthy Adults
Authors: Majid Manawer Alenezi, Gavin Lawrence, Hans-Peter Kubis
Abstract: Imagery training could be an important treatment for improving muscle function in patients whose exercise training is limited by pain or other adverse symptoms. However, recent studies are mostly limited to small muscle groups and are often contradictory, and a possible bilateral transfer effect of imagery training has not been examined. We therefore investigated the effectiveness of unilateral imagery training, in comparison with exercise training, on hip abductor muscle strength and EMG, assessing both limbs to investigate bilateral transfer effects. Healthy individuals took part in an imagery or exercise training intervention for two weeks and were assessed pre and post training. Participants (n=30), after randomization into an imagery and an exercise group, trained five times a week under supervision, with additional self-performed training on the weekends. The training consisted of performing, or imagining, five maximal isometric hip abductor contractions (one set), repeated seven times. All measurements and training sessions were performed lying on the side on a dynamometer table. The imagery script combined kinesthetic and visual imagery with an internal perspective for producing imagined maximal hip abduction contractions; the exercise group performed the same number of tasks, physically executing the maximal hip abductor contractions. Maximal hip abduction strength and EMG amplitudes of the right and left limbs were measured pre- and post-training, together with handgrip strength and right shoulder abduction (strength and EMG). Using a mixed-model ANOVA for the strength measures and Wilcoxon tests for the EMGs, the data revealed a significant increase in hip abductor strength in the imagery group on the trained right limb (~6%), which was not observed in the exercise group. Additionally, the left hip abduction strength (not used for training) did not show a main effect of time; however, there was a significant interaction of group and time, revealing that strength increased in the imagery group while it remained constant in the exercise group. EMG recordings supported the strength findings, showing significantly elevated EMG amplitudes after imagery training on both the right and left sides, while the exercise training group did not show any changes. Moreover, measures of handgrip strength and shoulder abduction showed no effects over time and no interactions in either group. The experiments showed that imagery training is a suitable method for effectively increasing functional parameters (strength and EMG) of larger limb muscles, with enhancements on both the trained and untrained sides confirming a bilateral transfer effect. Exercise training, by contrast, did not produce increases in these parameters; the healthy individuals tested might not easily achieve benefits from exercise training within the time tested. It is evident, however, that imagery training is effective in increasing the central motor command towards the muscles, that the effect appears to be segmental (no increase in handgrip strength or shoulder abduction parameters), and that it affects both the trained and untrained sides. In conclusion, imagery training was effective in producing functional improvements in limb muscles and a bilateral transfer on strength and EMG measures.
Keywords: imagery, exercise, physiotherapy, motor imagery
Procedia: https://publications.waset.org/abstracts/85814/effectiveness-of-imagery-compared-with-exercise-training-on-hip-abductor-strength-and-emg-production-in-healthy-adults | PDF: https://publications.waset.org/abstracts/85814.pdf | Downloads: 234
1035. The Effect of General Corrosion on the Guided Wave Inspection of the Pipeline
Authors: Shiuh-Kuang Yang, Sheam-Chyun Lin, Jyin-Wen Cheng, Deng-Guei Hsu
Abstract: The torsional guided wave mode T(0,1) has been applied to detect characteristics and defects in pipelines, especially in the cases of coated, elevated, and buried pipes. Unfortunately, the signals of minor corrosion can be covered by noise, because the coating material and the burial medium always induce strong attenuation of the guided wave. Furthermore, the guided wave is attenuated even more seriously, and the signals become harder to identify, when the array ring of transducers is set on an area of the pipe with general corrosion. The objective of this study is therefore to examine the effects of such general corrosion on guided wave tests through experiments and signal processing techniques, based on the finite element method, the two-dimensional Fourier transform, and the continuous wavelet transform. Results show that the excitation energy is reduced when the array ring is set on a pipe surface with general corrosion. The non-uniform contact surface also produces unwanted asymmetric modes of the propagating guided wave; some of these mix with the T(0,1) mode and increase the difficulty of measurement, especially when a defect or localized corrosion is embedded in the general corrosion area. It is also shown that guided wave attenuation increases with corrosion depth and with inspection frequency, whereas the coherent signals caused by the general corrosion decay with increasing frequency. The results of this research should help inspectors understand the impact of setting the array ring on an area of general corrosion and provide a way to distinguish localized corrosion lying within an area of general corrosion.
Keywords: guided wave, finite element method, two-dimensional Fourier transform, wavelet transform, general corrosion, localized corrosion
Procedia: https://publications.waset.org/abstracts/24573/the-effect-of-general-corrosion-on-the-guided-wave-inspection-of-the-pipeline | PDF: https://publications.waset.org/abstracts/24573.pdf | Downloads: 404
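The two-dimensional Fourier transform step mentioned above maps a space-time matrix of guided-wave signals into the frequency-wavenumber (f-k) domain, where propagating modes separate. The sketch below uses a synthetic single-mode signal; the velocity, excitation frequency, and sensor spacing are assumed placeholder values.

    # Space-time guided-wave record transformed into the f-k domain (sketch).
    import numpy as np

    fs, dx = 1e6, 0.01                  # sampling rate [Hz] and sensor spacing [m] (placeholders)
    t = np.arange(0, 2e-3, 1 / fs)      # time axis
    x = np.arange(64) * dx              # measurement positions along the pipe

    c  = 3200.0                         # assumed phase velocity of a T(0,1)-like mode [m/s]
    f0 = 64e3                           # assumed excitation centre frequency [Hz]
    signals = np.sin(2 * np.pi * f0 * (t[None, :] - x[:, None] / c))   # shape (space, time)

    spectrum    = np.fft.fftshift(np.abs(np.fft.fft2(signals)))
    freqs       = np.fft.fftshift(np.fft.fftfreq(len(t), d=1 / fs))
    wavenumbers = np.fft.fftshift(np.fft.fftfreq(len(x), d=dx))
    # Peaks of `spectrum` over (wavenumber, frequency) trace the dispersion of each mode.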
fields’ aerial imagery. This important tool has made it possible to identify patterns in crop fields, generating useful information for production management. Reflectance intensity data in different ranges of the electromagnetic spectrum may indicate the presence or absence of nutrients in the soil of an area. Relations between the different light bands may yield even more detailed information. Knowledge of the nutrient content in the soil, or in the crop during its growth, is a valuable asset to the farmer who seeks to optimize yield. However, small farmers in Brazil often lack the resources to access this kind of information, and, even when they do, it is not presented in a comprehensive and/or objective way. The challenges of implementing this technology thus range from sampling the imagery using aerial platforms, building a mosaic that covers the entire crop field, extracting the reflectance information and analyzing its relationship with the parameters of interest, to displaying the results in a manner that allows the farmer to make the necessary decisions more objectively. In this work, an analysis of soil nutrient content based on image processing of satellite imagery is proposed, and its outputs are compared with a commercial laboratory’s chemical analysis. Sources of satellite imagery are also compared to assess the feasibility and impact of using Google Earth data in this application versus imagery from satellites such as Landsat-8 and Pleiades. Furthermore, an algorithm for building mosaics is implemented using Google Earth imagery, and finally, the possibility of using unmanned aerial vehicles is analyzed. From the data obtained, several soil parameters are estimated, namely the content of potassium, phosphorus, boron and manganese, among others. The suitability of Google Earth imagery for this application is verified within a reasonable margin when compared to Pleiades satellite imagery and to the current commercial model. It is also verified that the mosaic construction method has little or no influence on the estimation results. Variability maps are created over the covered area, and the impacts of image resolution and sample time frame are discussed, allowing easy assessment of the results. The final results show that simpler and cheaper remote sensing and analysis methods are feasible alternatives that allow the small farmer, with little access to technological and/or financial resources, to make more accurate decisions about soil nutrient management.
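<p class="card-text">A minimal sketch of the kind of reflectance-to-nutrient regression described above, assuming per-plot band reflectance has already been extracted from the mosaic and paired with laboratory measurements; the band set, values and nutrient figures are illustrative placeholders, not the authors' data or code.</p> <pre><code>
# Illustrative only: relate per-plot band reflectance to lab-measured soil
# potassium with a simple linear regression (scikit-learn).
import numpy as np
from sklearn.linear_model import LinearRegression

# Mean reflectance per sampled plot in blue, green, red and NIR bands (toy values).
reflectance = np.array([
    [0.12, 0.18, 0.21, 0.43],
    [0.14, 0.20, 0.25, 0.38],
    [0.11, 0.17, 0.19, 0.47],
    [0.13, 0.19, 0.23, 0.40],
])
lab_potassium = np.array([3.1, 2.4, 3.8, 2.9])   # e.g. mmolc/dm3 from the lab

model = LinearRegression().fit(reflectance, lab_potassium)
# With only four calibration plots the fit is trivially perfect; a real
# calibration needs many more sampled plots and a held-out validation set.
print("R^2 on the calibration plots:", model.score(reflectance, lab_potassium))

# Estimate the nutrient content of an unsampled plot from its reflectance.
new_plot = np.array([[0.12, 0.18, 0.22, 0.44]])
print("estimated potassium:", model.predict(new_plot)[0])
</code></pre>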
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=remote%20sensing" title="remote sensing">remote sensing</a>, <a href="https://publications.waset.org/abstracts/search?q=precision%20agriculture" title=" precision agriculture"> precision agriculture</a>, <a href="https://publications.waset.org/abstracts/search?q=mosaic" title=" mosaic"> mosaic</a>, <a href="https://publications.waset.org/abstracts/search?q=soil" title=" soil"> soil</a>, <a href="https://publications.waset.org/abstracts/search?q=nutrient%20content" title=" nutrient content"> nutrient content</a>, <a href="https://publications.waset.org/abstracts/search?q=satellite%20imagery" title=" satellite imagery"> satellite imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=aerial%20imagery" title=" aerial imagery"> aerial imagery</a> </p> <a href="https://publications.waset.org/abstracts/86336/estimation-of-soil-nutrient-content-using-google-earth-and-pleiades-satellite-imagery-for-small-farms" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/86336.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">175</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1033</span> Plot Scale Estimation of Crop Biophysical Parameters from High Resolution Satellite Imagery</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shreedevi%20Moharana">Shreedevi Moharana</a>, <a href="https://publications.waset.org/abstracts/search?q=Subashisa%20Dutta"> Subashisa Dutta</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The present study focuses on the estimation of crop biophysical parameters like crop chlorophyll, nitrogen and water stress at plot scale in the crop fields. To achieve these, we have used high-resolution satellite LISS IV imagery. A new methodology has proposed in this research work, the spectral shape function of paddy crop is employed to get the significant wavelengths sensitive to paddy crop parameters. From the shape functions, regression index models were established for the critical wavelength with minimum and maximum wavelengths of multi-spectrum high-resolution LISS IV data. Moreover, the functional relationships were utilized to develop the index models. From these index models crop, biophysical parameters were estimated and mapped from LISS IV imagery at plot scale in crop field level. The result showed that the nitrogen content of the paddy crop varied from 2-8%, chlorophyll from 1.5-9% and water content variation observed from 40-90% respectively. It was observed that the variability in rice agriculture system in India was purely a function of field topography. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=crop%20parameters" title="crop parameters">crop parameters</a>, <a href="https://publications.waset.org/abstracts/search?q=index%20model" title=" index model"> index model</a>, <a href="https://publications.waset.org/abstracts/search?q=LISS%20IV%20imagery" title=" LISS IV imagery"> LISS IV imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=plot%20scale" title=" plot scale"> plot scale</a>, <a href="https://publications.waset.org/abstracts/search?q=shape%20function" title=" shape function"> shape function</a> </p> <a href="https://publications.waset.org/abstracts/89499/plot-scale-estimation-of-crop-biophysical-parameters-from-high-resolution-satellite-imagery" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/89499.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">168</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1032</span> Validation of a Placebo Method with Potential for Blinding in Ultrasound-Guided Dry Needling</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Johnson%20C.%20Y.%20Pang">Johnson C. Y. Pang</a>, <a href="https://publications.waset.org/abstracts/search?q=Bo%20Peng"> Bo Peng</a>, <a href="https://publications.waset.org/abstracts/search?q=Kara%20K.%20L.%20Reeves"> Kara K. L. Reeves</a>, <a href="https://publications.waset.org/abstracts/search?q=Allan%20C.%20L.%20Fud"> Allan C. L. Fud</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Objective: Dry needling (DN) has long been used as a treatment method for various musculoskeletal pain conditions. However, the evidence level of the studies was low due to the limitations of the methodology. Lack of randomization and inappropriate blinding is potentially the main sources of bias. A method that can differentiate clinical results due to the targeted experimental procedure from its placebo effect is needed to enhance the validity of the trial. Therefore, this study aimed to validate the method as a placebo ultrasound(US)-guided DN for patients with knee osteoarthritis (KOA). Design: This is a randomized controlled trial (RCT). Ninety subjects (25 males and 65 females) aged between 51 and 80 (61.26 ± 5.57) with radiological KOA were recruited and randomly assigned into three groups with a computer program. Group 1 (G1) received real US-guided DN, Group 2 (G2) received placebo US-guided DN, and Group 3 (G3) was the control group. Both G1 and G2 subjects received the same procedure of US-guided DN, except the US monitor was turned off in G2, blinding the G2 subjects to the incorporation of faux US guidance. This arrangement created the placebo effect intended to permit comparison of their results to those who received actual US-guided DN. Outcome measures, including the visual analog scale (VAS) and Knee injury and Osteoarthritis Outcome Score (KOOS) subscales of pain, symptoms, and quality of life (QOL), were analyzed by repeated measures analysis of covariance (ANCOVA) for time effects and group effects. The data regarding the perception of receiving real US-guided DN or placebo US-guided DN were analyzed by the chi-squared test. 
The missing data were analyzed with the intention-to-treat (ITT) approach if more than 5% of the data were missing. Results: The placebo US-guided DN (G2) subjects had the same perceptions as the use of real US guidance in the advancement of DN (p<0.128). G1 had significantly higher pain reduction (VAS and KOOS-pain) than G2 and G3 at 8 weeks (both p<0.05) only. There was no significant difference between G2 and G3 at 8 weeks (both p>0.05). Conclusion: The method with the US monitor turned off during the application of DN is credible for blinding the participants and allowing researchers to incorporate faux US guidance. The validated placebo US-guided DN technique can aid in investigations of the effects of US-guided DN with short-term effects of pain reduction for patients with KOA. Acknowledgment: This work was supported by the Caritas Institute of Higher Education [grant number IDG200101]. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ultrasound-guided%20dry%20needling" title="ultrasound-guided dry needling">ultrasound-guided dry needling</a>, <a href="https://publications.waset.org/abstracts/search?q=dry%20needling" title=" dry needling"> dry needling</a>, <a href="https://publications.waset.org/abstracts/search?q=knee%20osteoarthritis" title=" knee osteoarthritis"> knee osteoarthritis</a>, <a href="https://publications.waset.org/abstracts/search?q=physiotheraphy" title=" physiotheraphy"> physiotheraphy</a> </p> <a href="https://publications.waset.org/abstracts/147288/validation-of-a-placebo-method-with-potential-for-blinding-in-ultrasound-guided-dry-needling" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/147288.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">120</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1031</span> Optimizing Weight Loss with AI (GenAISᵀᴹ): A Randomized Trial of Dietary Supplement Prescriptions in Obese Patients</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Evgeny%20Pokushalov">Evgeny Pokushalov</a>, <a href="https://publications.waset.org/abstracts/search?q=Andrey%20Ponomarenko"> Andrey Ponomarenko</a>, <a href="https://publications.waset.org/abstracts/search?q=John%20Smith"> John Smith</a>, <a href="https://publications.waset.org/abstracts/search?q=Michael%20Johnson"> Michael Johnson</a>, <a href="https://publications.waset.org/abstracts/search?q=Claire%20Garcia"> Claire Garcia</a>, <a href="https://publications.waset.org/abstracts/search?q=Inessa%20Pak"> Inessa Pak</a>, <a href="https://publications.waset.org/abstracts/search?q=Evgenya%20Shrainer"> Evgenya Shrainer</a>, <a href="https://publications.waset.org/abstracts/search?q=Dmitry%20Kudlay"> Dmitry Kudlay</a>, <a href="https://publications.waset.org/abstracts/search?q=Sevda%20Bayramova"> Sevda Bayramova</a>, <a href="https://publications.waset.org/abstracts/search?q=Richard%20Miller"> Richard Miller</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Background: Obesity is a complex, multifactorial chronic disease that poses significant health risks. 
Recent advancements in artificial intelligence (AI) offer the potential for more personalized and effective dietary supplement (DS) regimens to promote weight loss. This study aimed to evaluate the efficacy of AI-guided DS prescriptions compared to standard physician-guided DS prescriptions in obese patients. Methods: This randomized, parallel-group pilot study enrolled 60 individuals aged 40 to 60 years with a body mass index (BMI) of 25 or greater. Participants were randomized to receive either AI-guided DS prescriptions (n = 30) or physician-guided DS prescriptions (n = 30) for 180 days. The primary endpoints were the percentage change in body weight and the proportion of participants achieving a ≥5% weight reduction. Secondary endpoints included changes in BMI, fat mass, visceral fat rating, systolic and diastolic blood pressure, lipid profiles, fasting plasma glucose, hsCRP levels, and postprandial appetite ratings. Adverse events were monitored throughout the study. Results: Both groups were well balanced in terms of baseline characteristics. Significant weight loss was observed in the AI-guided group, with a mean reduction of -12.3% (95% CI: -13.1 to -11.5%) compared to -7.2% (95% CI: -8.1 to -6.3%) in the physician-guided group, resulting in a treatment difference of -5.1% (95% CI: -6.4 to -3.8%; p < 0.01). At day 180, 84.7% of the AI-guided group achieved a weight reduction of ≥5%, compared to 54.5% in the physician-guided group (Odds Ratio: 4.3; 95% CI: 3.1 to 5.9; p < 0.01). Significant improvements were also observed in BMI, fat mass, and visceral fat rating in the AI-guided group (p < 0.01 for all). Postprandial appetite suppression was greater in the AI-guided group, with significant reductions in hunger and prospective food consumption, and increases in fullness and satiety (p < 0.01 for all). Adverse events were generally mild-to-moderate, with higher incidences of gastrointestinal symptoms in the AI-guided group, but these were manageable and did not impact adherence. Conclusion: The AI-guided dietary supplement regimen was more effective in promoting weight loss, improving body composition, and suppressing appetite compared to the physician-guided regimen. These findings suggest that AI-guided, personalized supplement prescriptions could offer a more effective approach to managing obesity. Further research with larger sample sizes is warranted to confirm these results and optimize AI-based interventions for weight loss. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=obesity" title="obesity">obesity</a>, <a href="https://publications.waset.org/abstracts/search?q=AI-guided" title=" AI-guided"> AI-guided</a>, <a href="https://publications.waset.org/abstracts/search?q=dietary%20supplements" title=" dietary supplements"> dietary supplements</a>, <a href="https://publications.waset.org/abstracts/search?q=weight%20loss" title=" weight loss"> weight loss</a>, <a href="https://publications.waset.org/abstracts/search?q=personalized%20medicine" title=" personalized medicine"> personalized medicine</a>, <a href="https://publications.waset.org/abstracts/search?q=metabolic%20health" title=" metabolic health"> metabolic health</a>, <a href="https://publications.waset.org/abstracts/search?q=appetite%20suppression" title=" appetite suppression"> appetite suppression</a> </p> <a href="https://publications.waset.org/abstracts/194486/optimizing-weight-loss-with-ai-genais-a-randomized-trial-of-dietary-supplement-prescriptions-in-obese-patients" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/194486.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">8</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1030</span> Agile Real-Time Field Programmable Gate Array-Based Image Processing System for Drone Imagery in Digital Agriculture</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sabiha%20Shahid%20Antora">Sabiha Shahid Antora</a>, <a href="https://publications.waset.org/abstracts/search?q=Young%20Ki%20Chang"> Young Ki Chang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Along with various farm management technologies, imagery is an important tool that facilitates crop assessment, monitoring, and management. As a consequence, drone imaging technology is playing a vital role to capture the state of the entire field for yield mapping, crop scouting, weed detection, and so on. Although it is essential to inspect the cultivable lands in real-time for making rapid decisions regarding field variable inputs to combat stresses and diseases, drone imagery is still evolving in this area of interest. Cost margin and post-processing complexions of the image stream are the main challenges of imaging technology. Therefore, this proposed project involves the cost-effective field programmable gate array (FPGA) based image processing device that would process the image stream in real-time as well as providing the processed output to support on-the-spot decisions in the crop field. As a result, the real-time FPGA-based image processing system would reduce operating costs while minimizing a few intermediate steps to deliver scalable field decisions. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=real-time" title="real-time">real-time</a>, <a href="https://publications.waset.org/abstracts/search?q=FPGA" title=" FPGA"> FPGA</a>, <a href="https://publications.waset.org/abstracts/search?q=drone%20imagery" title=" drone imagery"> drone imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20processing" title=" image processing"> image processing</a>, <a href="https://publications.waset.org/abstracts/search?q=crop%20monitoring" title=" crop monitoring"> crop monitoring</a> </p> <a href="https://publications.waset.org/abstracts/132611/agile-real-time-field-programmable-gate-array-based-image-processing-system-for-drone-imagery-in-digital-agriculture" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/132611.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">113</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1029</span> Added Value of 3D Ultrasound Image Guided Hepatic Interventions by X Matrix Technology</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ahmed%20Abdel%20Sattar%20Khalil">Ahmed Abdel Sattar Khalil</a>, <a href="https://publications.waset.org/abstracts/search?q=Hazem%20Omar"> Hazem Omar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Background: Image-guided hepatic interventions are integral to the management of infective and neoplastic liver lesions. Over the past decades, 2D ultrasound was used for guidance of hepatic interventions; with the recent advances in ultrasound technology, 3D ultrasound was used to guide hepatic interventions. The aim of this study was to illustrate the added value of 3D image guided hepatic interventions by x matrix technology. Patients and Methods: This prospective study was performed on 100 patients who were divided into two groups; group A included 50 patients who were managed by 2D ultrasonography probe guidance, and group B included 50 patients who were managed by 3D X matrix ultrasonography probe guidance. Thermal ablation was done for 70 patients, 40 RFA (20 by the 2D probe and 20 by the 3D x matrix probe), and 30 MWA (15 by the 2D probe and 15 by the 3D x matrix probe). Chemical ablation (PEI) was done on 20 patients (10 by the 2D probe and 10 by the 3D x matrix probe). Drainage of hepatic collections and biopsy from undiagnosed hepatic focal lesions was done on 10 patients (5 by the 2D probe and 5 by the 3D x matrix probe). Results: The efficacy of ultrasonography-guided hepatic interventions by 3D x matrix probe was higher than the 2D probe but not significantly higher, with a p-value of 0.705, 0.5428 for RFA, MWA respectively, 0.5312 for PEI, 0.2918 for drainage of hepatic collections and biopsy. The complications related to the use of the 3D X matrix probe were significantly lower than the 2D probe, with a p-value of 0.003. The timing of the procedure was shorter by the usage of 3D x matrix probe in comparison to the 2D probe with a p-value of 0.08,0.34 for RFA and PEI and significantly shorter for MWA, and drainage of hepatic collection, biopsy with a P-value of 0.02,0.001 respectively. 
Conclusions: 3D ultrasonography-guided hepatic interventions with the x matrix probe have better efficacy, fewer complications, and shorter procedure times than 2D ultrasonography-guided hepatic interventions. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=3D" title="3D">3D</a>, <a href="https://publications.waset.org/abstracts/search?q=X%20matrix" title=" X matrix"> X matrix</a>, <a href="https://publications.waset.org/abstracts/search?q=2D" title=" 2D"> 2D</a>, <a href="https://publications.waset.org/abstracts/search?q=ultrasonography" title=" ultrasonography"> ultrasonography</a>, <a href="https://publications.waset.org/abstracts/search?q=MWA" title=" MWA"> MWA</a>, <a href="https://publications.waset.org/abstracts/search?q=RFA" title=" RFA"> RFA</a>, <a href="https://publications.waset.org/abstracts/search?q=PEI" title=" PEI"> PEI</a>, <a href="https://publications.waset.org/abstracts/search?q=drainage%20of%20hepatic%20collections" title=" drainage of hepatic collections"> drainage of hepatic collections</a>, <a href="https://publications.waset.org/abstracts/search?q=biopsy" title=" biopsy"> biopsy</a> </p> <a href="https://publications.waset.org/abstracts/173809/added-value-of-3d-ultrasound-image-guided-hepatic-interventions-by-x-matrix-technology" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/173809.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">95</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1028</span> Generic Hybrid Models for Two-Dimensional Ultrasonic Guided Wave Problems</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Manoj%20Reghu">Manoj Reghu</a>, <a href="https://publications.waset.org/abstracts/search?q=Prabhu%20Rajagopal"> Prabhu Rajagopal</a>, <a href="https://publications.waset.org/abstracts/search?q=C.%20V.%20Krishnamurthy"> C. V. Krishnamurthy</a>, <a href="https://publications.waset.org/abstracts/search?q=Krishnan%20Balasubramaniam"> Krishnan Balasubramaniam</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A thorough understanding of guided ultrasonic wave behavior in structures is essential for the application of existing Non Destructive Evaluation (NDE) technologies, as well as for the development of new methods. However, the analysis of guided wave phenomena is challenging because of their complex dispersive and multimodal nature. Although numerical solution procedures have proven to be very useful in this regard, the increasing complexity of features and defects to be considered, as well as the desire to improve the accuracy of inspection, often imposes a large computational cost. Hybrid models that combine numerical solutions for wave scattering with faster alternative methods for wave propagation have long been considered as a solution to this problem. However, such models usually require modification of the base code of the solution procedure. Here we aim to develop Generic Hybrid models that can be directly applied to any two different solution procedures. With this goal in mind, a Numerical Hybrid model and an Analytical-Numerical Hybrid model have been developed.
The concept and implementation of these Hybrid models are discussed in this paper. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=guided%20ultrasonic%20waves" title="guided ultrasonic waves">guided ultrasonic waves</a>, <a href="https://publications.waset.org/abstracts/search?q=Finite%20Element%20Method%20%28FEM%29" title=" Finite Element Method (FEM)"> Finite Element Method (FEM)</a>, <a href="https://publications.waset.org/abstracts/search?q=Hybrid%20model" title=" Hybrid model"> Hybrid model</a> </p> <a href="https://publications.waset.org/abstracts/16058/generic-hybrid-models-for-two-dimensional-ultrasonic-guided-wave-problems" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/16058.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">465</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1027</span> Numerical Study of Nonlinear Guided Waves in Composite Laminates with Delaminations</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Reza%20Soleimanpour">Reza Soleimanpour</a>, <a href="https://publications.waset.org/abstracts/search?q=Ching%20Tai%20Ng"> Ching Tai Ng</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Fibre-composites are widely used in various structures due to their attractive properties, such as a higher stiffness-to-mass ratio and better corrosion resistance compared to metallic materials. However, one serious weakness of this composite material is delamination, which is a subsurface separation of laminae. A low level of this barely visible damage can cause a significant reduction in residual compressive strength. In the last decade, the application of guided waves for damage detection has been a topic of significant interest for many researchers. Among all guided wave techniques, nonlinear guided waves have shown outstanding sensitivity and capability for detecting different types of damage, e.g., cracks and delaminations. So far, most research on applications of nonlinear guided waves has been dedicated to isotropic materials, such as aluminium and steel, while only a few works have addressed the nonlinear characteristics of guided waves in anisotropic materials. This study investigates the nonlinear interactions of the fundamental antisymmetric Lamb wave (A0) with delamination in composite laminates using three-dimensional (3D) explicit finite element (FE) simulations. The nonlinearity considered in this study arises from interactions of the two interfaces of the sub-laminates at the delamination region, which generate contact acoustic nonlinearity (CAN). The aim of this research is to investigate the phenomena of CAN in composite laminated beams by a series of numerical case studies. In this study, the interaction of the fundamental antisymmetric Lamb wave with delaminations of different sizes is studied in detail. The results show that the A0 Lamb wave interacts with the delaminations, generating CAN in the form of higher harmonics, which is a good indicator for determining the existence of delaminations in composite laminates.
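<p class="card-text">A minimal signal-level sketch of the higher-harmonic indicator described above, using a synthetic clipped sinusoid to mimic contact acoustic nonlinearity; the sampling rate, excitation frequency and clipping model are assumptions for illustration, not the FE simulation settings of the study.</p> <pre><code>
# Illustrative CAN indicator: ratio of the second-harmonic amplitude to the
# fundamental in the spectrum of a received signal. Asymmetric clipping is a
# crude stand-in for the clapping delamination interfaces.
import numpy as np

fs = 5e6                      # sampling frequency, Hz (assumed)
f0 = 100e3                    # excitation frequency, Hz (assumed)
t = np.arange(0, 2e-4, 1 / fs)

clean = np.sin(2 * np.pi * f0 * t)
received = np.clip(clean, -1.0, 0.6)          # nonlinearity creates harmonics

spectrum = np.abs(np.fft.rfft(received * np.hanning(len(received))))
freqs = np.fft.rfftfreq(len(received), 1 / fs)

a1 = spectrum[np.argmin(np.abs(freqs - f0))]        # fundamental amplitude
a2 = spectrum[np.argmin(np.abs(freqs - 2 * f0))]    # second-harmonic amplitude
print("relative second harmonic A2/A1:", a2 / a1)
</code></pre>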
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=contact%20acoustic%20nonlinearity" title="contact acoustic nonlinearity">contact acoustic nonlinearity</a>, <a href="https://publications.waset.org/abstracts/search?q=delamination" title=" delamination"> delamination</a>, <a href="https://publications.waset.org/abstracts/search?q=fibre%20reinforced%20composite%20beam" title=" fibre reinforced composite beam"> fibre reinforced composite beam</a>, <a href="https://publications.waset.org/abstracts/search?q=finite%20element" title=" finite element"> finite element</a>, <a href="https://publications.waset.org/abstracts/search?q=nonlinear%20guided%20waves" title=" nonlinear guided waves"> nonlinear guided waves</a> </p> <a href="https://publications.waset.org/abstracts/45425/numerical-study-of-nonlinear-guided-waves-in-composite-laminates-with-delaminations" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/45425.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">204</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1026</span> Application of Rapid Eye Imagery in Crop Type Classification Using Vegetation Indices</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sunita%20Singh">Sunita Singh</a>, <a href="https://publications.waset.org/abstracts/search?q=Rajani%20Srivastava"> Rajani Srivastava</a> </p> <p class="card-text"><strong>Abstract:</strong></p> For natural resource management and in other applications about earth observation revolutionary remote sensing technology plays a significant role. One of such application in monitoring and classification of crop types at spatial and temporal scale, as it provides latest, most precise and cost-effective information. Present study emphasizes the use of three different vegetation indices of Rapid Eye imagery on crop type classification. It also analyzed the effect of each indices on classification accuracy. Rapid Eye imagery is highly demanded and preferred for agricultural and forestry sectors as it has red-edge and NIR bands. The three indices used in this study were: the Normalized Difference Vegetation Index (NDVI), the Green Normalized Difference Vegetation Index (GNDVI), and the Normalized Difference Red Edge Index (NDRE) and all of these incorporated the Red Edge band. The study area is Varanasi district of Uttar Pradesh, India and Radial Basis Function (RBF) kernel was used here for the Support Vector Machines (SVMs) classification. Classification was performed with these three vegetation indices. The contribution of each indices on image classification accuracy was also tested with single band classification. Highest classification accuracy of 85% was obtained using three vegetation indices. The study concluded that NDRE has the highest contribution on classification accuracy compared to the other vegetation indices and the Rapid Eye imagery can get satisfactory results of classification accuracy without original bands. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=GNDVI" title="GNDVI">GNDVI</a>, <a href="https://publications.waset.org/abstracts/search?q=NDRE" title=" NDRE"> NDRE</a>, <a href="https://publications.waset.org/abstracts/search?q=NDVI" title=" NDVI"> NDVI</a>, <a href="https://publications.waset.org/abstracts/search?q=rapid%20eye" title=" rapid eye"> rapid eye</a>, <a href="https://publications.waset.org/abstracts/search?q=vegetation%20indices" title=" vegetation indices"> vegetation indices</a> </p> <a href="https://publications.waset.org/abstracts/79921/application-of-rapid-eye-imagery-in-crop-type-classification-using-vegetation-indices" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/79921.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">362</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1025</span> Potassium-Phosphorus-Nitrogen Detection and Spectral Segmentation Analysis Using Polarized Hyperspectral Imagery and Machine Learning </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Nicholas%20V.%20Scott">Nicholas V. Scott</a>, <a href="https://publications.waset.org/abstracts/search?q=Jack%20McCarthy"> Jack McCarthy </a> </p> <p class="card-text"><strong>Abstract:</strong></p> Military, law enforcement, and counter terrorism organizations are often tasked with target detection and image characterization of scenes containing explosive materials in various types of environments where light scattering intensity is high. Mitigation of this photonic noise using classical digital filtration and signal processing can be difficult. This is partially due to the lack of robust image processing methods for photonic noise removal, which strongly influence high resolution target detection and machine learning-based pattern recognition. Such analysis is crucial to the delivery of reliable intelligence. Polarization filters are a possible method for ambient glare reduction by allowing only certain modes of the electromagnetic field to be captured, providing strong scene contrast. An experiment was carried out utilizing a polarization lens attached to a hyperspectral imagery camera for the purpose of exploring the degree to which an imaged polarized scene of potassium, phosphorus, and nitrogen mixture allows for improved target detection and image segmentation. Preliminary imagery results based on the application of machine learning algorithms, including competitive leaky learning and distance metric analysis, to polarized hyperspectral imagery, suggest that polarization filters provide a slight advantage in image segmentation. The results of this work have implications for understanding the presence of explosive material in dry, desert areas where reflective glare is a significant impediment to scene characterization. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=explosive%20material" title="explosive material">explosive material</a>, <a href="https://publications.waset.org/abstracts/search?q=hyperspectral%20imagery" title=" hyperspectral imagery"> hyperspectral imagery</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20segmentation" title=" image segmentation"> image segmentation</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=polarization" title=" polarization"> polarization</a> </p> <a href="https://publications.waset.org/abstracts/127733/potassium-phosphorus-nitrogen-detection-and-spectral-segmentation-analysis-using-polarized-hyperspectral-imagery-and-machine-learning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/127733.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">141</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1024</span> Validation of a Placebo Method with Potential for Blinding in Ultrasound-Guided Dry Needling</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Johnson%20C.%20Y.%20Pang">Johnson C. Y. Pang</a>, <a href="https://publications.waset.org/abstracts/search?q=Bo%20Pengb"> Bo Pengb</a>, <a href="https://publications.waset.org/abstracts/search?q=Kara%20K.%20L.%20Reevesc"> Kara K. L. Reevesc</a>, <a href="https://publications.waset.org/abstracts/search?q=Allan%20C.%20L.%20Fud"> Allan C. L. Fud</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Objective: Dry needling (DN) has long been used as a treatment method for various musculoskeletal pain conditions. However, the evidence level of the studies was low due to the limitations of the methodology. Lack of randomization and inappropriate blinding are potentially the main sources of bias. A method that can differentiate clinical results due to the targeted experimental procedure from its placebo effect is needed to enhance the validity of the trial. Therefore, this study aimed to validate the method as a placebo ultrasound(US)-guided DN for patients with knee osteoarthritis (KOA). Design: This is a randomized controlled trial (RCT). Ninety subjects (25 males and 65 females) aged between 51 and 80 (61.26±5.57) with radiological KOA were recruited and randomly assigned into three groups with a computer program. Group 1 (G1) received real US-guided DN, Group 2 (G2) received placebo US-guided DN, and Group 3 (G3) was the control group. Both G1 and G2 subjects received the same procedure of US-guided DN, except the US monitor was turned off in G2, blinding the G2 subjects to the incorporation of faux US guidance. This arrangement created the placebo effect intended to permit comparison of their results to those who received actual US-guided DN. Outcome measures, including the visual analog scale (VAS) and Knee injury and Osteoarthritis Outcome Score (KOOS) subscales of pain, symptoms and quality of life (QOL), were analyzed by repeated-measures analysis of covariance (ANCOVA) for time effects and group effects. 
The data regarding the perception of receiving real US-guided DN or placebo US-guided DN were analyzed by the chi-squared test. The missing data were analyzed with the intention-to-treat (ITT) approach if more than 5% of the data were missing. Results: The placebo US-guided DN (G2) subjects had the same perceptions as the use of real US guidance in the advancement of DN (p<0.128). G1 had significantly higher pain reduction (VAS and KOOS-pain) than G2 and G3 at 8 weeks (both p<0.05) only. There was no significant difference between G2 and G3 at 8 weeks (both p>0.05). Conclusion: The method with the US monitor turned off during the application of DN is credible for blinding the participants and allowing researchers to incorporate faux US guidance. The validated placebo US-guided DN technique can aid in investigations of the effects of US-guided DN with short-term effects of pain reduction for patients with KOA. Acknowledgment: This work was supported by the Caritas Institute of Higher Education [grant number IDG200101]. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=reliability" title="reliability">reliability</a>, <a href="https://publications.waset.org/abstracts/search?q=jumping" title=" jumping"> jumping</a>, <a href="https://publications.waset.org/abstracts/search?q=3D%20motion%20analysis" title=" 3D motion analysis"> 3D motion analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=anterior%20crucial%20ligament%20reconstruction" title=" anterior crucial ligament reconstruction"> anterior crucial ligament reconstruction</a> </p> <a href="https://publications.waset.org/abstracts/147767/validation-of-a-placebo-method-with-potential-for-blinding-in-ultrasound-guided-dry-needling" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/147767.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">119</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1023</span> A Teaching Method for Improving Sentence Fluency in Writing</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Manssour%20Habbash">Manssour Habbash</a>, <a href="https://publications.waset.org/abstracts/search?q=Srinivasa%20Rao%20Idapalapati"> Srinivasa Rao Idapalapati</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Although writing is a multifaceted task, teaching writing is a demanding task basically for two reasons: Grammar and Syntax. This article provides a method of teaching writing that was found to be effective in improving students’ academic writing composition skill. The article explains the concepts of ‘guided-discovery’ and ‘guided-construction’ upon which a method of teaching writing is grounded and developed. Providing a brief commentary on what the core could mean primarily, the article presents an exposition of understanding and identifying the core and building upon the core that can demonstrate the way a teacher can make use of the concepts in teaching for improving the writing skills of their students. The method is an adaptation of grammar translation method that has been improvised to suit to a student-centered classroom environment. 
An intervention of teaching writing through this method was tried out with positive outcomes in a formal classroom research setup. Because the content relates closely to classroom practice and is useful to practicing teachers, the process and the findings are presented in narrative form, along with the results in tabular form. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=core%20of%20a%20text" title="core of a text">core of a text</a>, <a href="https://publications.waset.org/abstracts/search?q=guided%20construction" title=" guided construction"> guided construction</a>, <a href="https://publications.waset.org/abstracts/search?q=guided%20discovery" title=" guided discovery"> guided discovery</a>, <a href="https://publications.waset.org/abstracts/search?q=theme%20of%20a%20text" title=" theme of a text"> theme of a text</a> </p> <a href="https://publications.waset.org/abstracts/42210/a-teaching-method-for-improving-sentence-fluency-in-writing" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/42210.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">380</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1022</span> Automatic Extraction of Arbitrarily Shaped Buildings from VHR Satellite Imagery</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Evans%20Belly">Evans Belly</a>, <a href="https://publications.waset.org/abstracts/search?q=Imdad%20Rizvi"> Imdad Rizvi</a>, <a href="https://publications.waset.org/abstracts/search?q=M.%20M.%20Kadam"> M. M. Kadam</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Satellite imagery is an emerging technology that is extensively utilized in various applications such as detection/extraction of man-made structures, monitoring of sensitive areas, creating graphic maps, etc. The main approach here is the automated detection of buildings from very high resolution (VHR) optical satellite images. Initially, the shadow, building and non-building regions (roads, vegetation, etc.) are investigated, with the main focus on building extraction. Once all the landscape regions are collected, a trimming process is applied to eliminate regions that arise from non-building objects. Finally, the label method is used to extract the building regions. The label method may be altered for efficient building extraction. The images used for the analysis are those acquired from sensors with a resolution finer than 1 meter (VHR). This method provides an efficient way to produce good results: the additional overhead of intermediate processing is eliminated, easing the processing steps required and reducing the time consumed, without compromising the quality of the output.
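<p class="card-text">The labelling and trimming steps can be illustrated with a small sketch; the threshold, minimum area and random image below are assumptions, and scipy's connected-component labelling stands in for whatever label method the authors actually use.</p> <pre><code>
# Illustrative label step: threshold a single band into candidate building
# pixels, label connected regions, and trim small non-building blobs.
import numpy as np
from scipy import ndimage

image = np.random.rand(64, 64)              # stand-in for a VHR band
candidates = image > 0.8                    # crude building/non-building split

labels, n = ndimage.label(candidates)       # connected-component labelling
sizes = ndimage.sum(candidates, labels, index=range(1, n + 1))

min_area = 5                                # assumed trimming threshold
keep = [i + 1 for i, s in enumerate(sizes) if s >= min_area]
buildings = np.isin(labels, keep)
print("regions kept:", len(keep), "of", n)
print("building pixels:", int(buildings.sum()))
</code></pre>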
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=building%20detection" title="building detection">building detection</a>, <a href="https://publications.waset.org/abstracts/search?q=shadow%20detection" title=" shadow detection"> shadow detection</a>, <a href="https://publications.waset.org/abstracts/search?q=landscape%20generation" title=" landscape generation"> landscape generation</a>, <a href="https://publications.waset.org/abstracts/search?q=label" title=" label"> label</a>, <a href="https://publications.waset.org/abstracts/search?q=partitioning" title=" partitioning"> partitioning</a>, <a href="https://publications.waset.org/abstracts/search?q=very%20high%20resolution%20%28VHR%29%20satellite%20imagery" title=" very high resolution (VHR) satellite imagery"> very high resolution (VHR) satellite imagery</a> </p> <a href="https://publications.waset.org/abstracts/76690/automatic-extraction-of-arbitrarily-shaped-buildings-from-vhr-satellite-imagery" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/76690.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">314</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">1021</span> Effect of Motor Imagery of Truncal Exercises on Trunk Function and Balance in Early Stroke: A Randomized Controlled Trial</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Elsa%20Reethu">Elsa Reethu</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20Karthik%20Babu"> S. Karthik Babu</a>, <a href="https://publications.waset.org/abstracts/search?q=N.%20Syed"> N. Syed</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Background: Studies in the past focused on the additional benefits of action observation in improving upper and lower limb functions and improving activities of daily living when administered along with conventional therapy. Nevertheless, there is a paucity of literature proving the effects of motor imagery of truncal exercise in improving trunk control in patients with stroke. Aims/purpose: To study the effect of motor imagery of truncal exercises on trunk function and balance in early stroke. Methods: A total of 24 patients were included in the study. 12 were included in the experimental group and 12 were included in control group Trunk function was measured using Trunk Control Test (TCT), Trunk Impairment Scale Verheyden (TIS Verheyden) and Trunk Impairment Scale Fujiwara (TIS Fujiwara). The balance was assessed using Brunel Balance Assessment (BBA) and Tinetti POMA. For the experimental group, each session was for 30 minutes of physical exercises and 15 minutes of motor imagery, once a day, six times a week for 3 weeks and prior to the exercise session, patients viewed a video tape of all the trunk exercises to be performed for 15minutes. The control group practiced the trunk exercises alone for the same duration. Measurements were taken before, after and 4 weeks after intervention. 
Results: The effect of treatment in motor imagery group showed better improvement when compared with control group when measured after 3 weeks on values of static sitting balance, dynamic balance, total TIS (Verheyden) score, BBA, Tinetti balance and gait with a large effect size of 0.86, 1.99, 1.69, 1.06, 1.63 and 0.97 respectively. The moderate effect size was seen in values of TIS Fujiwara (0.58) and small effect size was seen on TCT (0.12) and TIS coordination component (0.13).at the end of 4 weeks after intervention, the large effect size was identified on values of dynamic balance (2.06), total TIS score (1.59) and Tinetti balance (1.24). The moderate effect size was observed on BBA (0.62) and Tinetti gait (0.72). Conclusion: Trunk motor imagery is effective in improving trunk function and balance in patients with stroke and has a carryover effect in the aspects of mobility. The therapy gain that was observed during the time of discharge was seen to be maintained at the follow-up levels. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=stroke" title="stroke">stroke</a>, <a href="https://publications.waset.org/abstracts/search?q=trunk%20rehabilitation" title=" trunk rehabilitation"> trunk rehabilitation</a>, <a href="https://publications.waset.org/abstracts/search?q=trunk%20function" title=" trunk function"> trunk function</a>, <a href="https://publications.waset.org/abstracts/search?q=balance" title=" balance"> balance</a>, <a href="https://publications.waset.org/abstracts/search?q=motor%20imagery" title=" motor imagery"> motor imagery</a> </p> <a href="https://publications.waset.org/abstracts/45751/effect-of-motor-imagery-of-truncal-exercises-on-trunk-function-and-balance-in-early-stroke-a-randomized-controlled-trial" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/45751.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">300</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">&lsaquo;</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=10">10</a></li> <li class="page-item 
disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=34">34</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=35">35</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=guided%20imagery&amp;page=2" rel="next">&rsaquo;</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">&copy; 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div 
class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">&times;</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>
