Search results for: long short-term memory (LSTM)
aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="long short-term memory (LSTM)"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 7009</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: long short-term memory (LSTM)</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7009</span> Long Short-Time Memory Neural Networks for Human Driving Behavior Modelling</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Lu%20Zhao">Lu Zhao</a>, <a href="https://publications.waset.org/abstracts/search?q=Nadir%20Farhi"> Nadir Farhi</a>, <a href="https://publications.waset.org/abstracts/search?q=Yeltsin%20Valero"> Yeltsin Valero</a>, <a href="https://publications.waset.org/abstracts/search?q=Zoi%20Christoforou"> Zoi Christoforou</a>, <a href="https://publications.waset.org/abstracts/search?q=Nadia%20Haddadou"> Nadia Haddadou</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, a long short-term memory (LSTM) neural network model is proposed to replicate simultaneously car-following and lane-changing behaviors in road networks. By combining two kinds of LSTM layers and three input designs of the neural network, six variants of the LSTM model have been created. These models were trained and tested on the NGSIM 101 dataset, and the results were evaluated in terms of longitudinal speed and lateral position, respectively. Then, we compared the LSTM model with a classical car-following model (the intelligent driving model (IDM)) in the part of speed decision. In addition, the LSTM model is compared with a model using classical neural networks. After the comparison, the LSTM model demonstrates higher accuracy than the physical model IDM in terms of car-following behavior and displays better performance with regard to both car-following and lane-changing behavior compared to the classical neural network model. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=traffic%20modeling" title="traffic modeling">traffic modeling</a>, <a href="https://publications.waset.org/abstracts/search?q=neural%20networks" title=" neural networks"> neural networks</a>, <a href="https://publications.waset.org/abstracts/search?q=LSTM" title=" LSTM"> LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=car-following" title=" car-following"> car-following</a>, <a href="https://publications.waset.org/abstracts/search?q=lane-change" title=" lane-change"> lane-change</a> </p> <a href="https://publications.waset.org/abstracts/139730/long-short-time-memory-neural-networks-for-human-driving-behavior-modelling" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/139730.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">261</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7008</span> Long Short-Term Memory Stream Cruise Control Method for Automated Drift Detection and Adaptation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mohammad%20Abu-Shaira">Mohammad Abu-Shaira</a>, <a href="https://publications.waset.org/abstracts/search?q=Weishi%20Shi"> Weishi Shi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Adaptive learning, a commonly employed solution to drift, involves updating predictive models online during their operation to react to concept drifts, thereby serving as a critical component and natural extension for online learning systems that learn incrementally from each example. This paper introduces LSTM-SCCM “Long Short-Term Memory Stream Cruise Control Method”, a drift adaptation-as-a-service framework for online learning. LSTM-SCCM automates drift adaptation through prompt detection, drift magnitude quantification, dynamic hyperparameter tuning, performing shortterm optimization and model recalibration for immediate adjustments, and, when necessary, conducting long-term model recalibration to ensure deeper enhancements in model performance. LSTM-SCCM is incorporated into a suite of cutting-edge online regression models, assessing their performance across various types of concept drift using diverse datasets with varying characteristics. The findings demonstrate that LSTM-SCCM represents a notable advancement in both model performance and efficacy in handling concept drift occurrences. LSTM-SCCM stands out as the sole framework adept at effectively tackling concept drifts within regression scenarios. Its proactive approach to drift adaptation distinguishes it from conventional reactive methods, which typically rely on retraining after significant degradation to model performance caused by drifts. Additionally, LSTM-SCCM employs an in-memory approach combined with the Self-Adjusting Memory (SAM) architecture to enhance real-time processing and adaptability. The framework incorporates variable thresholding techniques and does not assume any particular data distribution, making it an ideal choice for managing high-dimensional datasets and efficiently handling large-scale data. 
Our experiments, which include abrupt, incremental, and gradual drifts across both low- and high-dimensional datasets with varying noise levels, applied to four state-of-the-art online regression models, demonstrate that LSTM-SCCM is versatile and effective, making it a valuable solution for online regression models addressing concept drift.
Keywords: automated drift detection and adaptation, concept drift, hyperparameters optimization, online and adaptive learning, regression
Procedia PDF: https://publications.waset.org/abstracts/193474.pdf (Downloads: 11)

7007. Analysis of Multilayer Neural Network Modeling and Long Short-Term Memory
Authors: Danilo López, Nelson Vera, Luis Pedraza
Abstract: This paper analyzes fundamental ideas and concepts related to neural networks, providing the reader with a theoretical explanation of the operation of Long Short-Term Memory (LSTM) networks, classified as deep learning systems, and explicitly presenting the mathematical development of the backward-pass equations of the LSTM network model. This mathematical modeling, together with the associated software development, provides the tools needed to develop an intelligent system capable of predicting the behavior of licensed users in wireless cognitive radio networks.
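For reference, the standard LSTM forward-pass equations from which such backward-pass derivations start are reproduced below (gate and weight conventions vary slightly across references; this is one common formulation, not necessarily the paper's exact notation).

```latex
% Standard LSTM forward pass; \sigma is the logistic sigmoid, \odot the Hadamard product.
\[
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{(input gate)}\\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{(forget gate)}\\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{(output gate)}\\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{(candidate cell state)}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{(cell state update)}\\
h_t &= o_t \odot \tanh(c_t) && \text{(hidden state)}
\end{aligned}
\]
```

The backward pass then follows by applying the chain rule through these six equations, propagating gradients backwards through both h and c.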
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=neural%20networks" title="neural networks">neural networks</a>, <a href="https://publications.waset.org/abstracts/search?q=multilayer%20perceptron" title=" multilayer perceptron"> multilayer perceptron</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory" title=" long short-term memory"> long short-term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=recurrent%20neuronal%20network" title=" recurrent neuronal network"> recurrent neuronal network</a>, <a href="https://publications.waset.org/abstracts/search?q=mathematical%20analysis" title=" mathematical analysis"> mathematical analysis</a> </p> <a href="https://publications.waset.org/abstracts/63507/analysis-of-multilayer-neural-network-modeling-and-long-short-term-memory" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/63507.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">420</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7006</span> Deep Learning Approaches for Accurate Detection of Epileptic Seizures from Electroencephalogram Data</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ramzi%20Rihane">Ramzi Rihane</a>, <a href="https://publications.waset.org/abstracts/search?q=Yassine%20Benayed"> Yassine Benayed</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Epilepsy is a chronic neurological disorder characterized by recurrent, unprovoked seizures resulting from abnormal electrical activity in the brain. Timely and accurate detection of these seizures is essential for improving patient care. In this study, we leverage the UK Bonn University open-source EEG dataset and employ advanced deep-learning techniques to automate the detection of epileptic seizures. By extracting key features from both time and frequency domains, as well as Spectrogram features, we enhance the performance of various deep learning models. Our investigation includes architectures such as Long Short-Term Memory (LSTM), Bidirectional LSTM (Bi-LSTM), 1D Convolutional Neural Networks (1D-CNN), and hybrid CNN-LSTM and CNN-BiLSTM models. The models achieved impressive accuracies: LSTM (98.52%), Bi-LSTM (98.61%), CNN-LSTM (98.91%), CNN-BiLSTM (98.83%), and CNN (98.73%). Additionally, we utilized a data augmentation technique called SMOTE, which yielded the following results: CNN (97.36%), LSTM (97.01%), Bi-LSTM (97.23%), CNN-LSTM (97.45%), and CNN-BiLSTM (97.34%). These findings demonstrate the effectiveness of deep learning in capturing complex patterns in EEG signals, providing a reliable and scalable solution for real-time seizure detection in clinical environments. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=electroencephalogram" title="electroencephalogram">electroencephalogram</a>, <a href="https://publications.waset.org/abstracts/search?q=epileptic%20seizure" title=" epileptic seizure"> epileptic seizure</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=LSTM" title=" LSTM"> LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=CNN" title=" CNN"> CNN</a>, <a href="https://publications.waset.org/abstracts/search?q=BI-LSTM" title=" BI-LSTM"> BI-LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=seizure%20detection" title=" seizure detection"> seizure detection</a> </p> <a href="https://publications.waset.org/abstracts/193110/deep-learning-approaches-for-accurate-detection-of-epileptic-seizures-from-electroencephalogram-data" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/193110.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">12</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7005</span> Sentiment Analysis of Chinese Microblog Comments: Comparison between Support Vector Machine and Long Short-Term Memory</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Xu%20Jiaqiao">Xu Jiaqiao</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Text sentiment analysis is an important branch of natural language processing. This technology is widely used in public opinion analysis and web surfing recommendations. At present, the mainstream sentiment analysis methods include three parts: sentiment analysis based on a sentiment dictionary, based on traditional machine learning, and based on deep learning. This paper mainly analyzes and compares the advantages and disadvantages of the SVM method of traditional machine learning and the Long Short-term Memory (LSTM) method of deep learning in the field of Chinese sentiment analysis, using Chinese comments on Sina Microblog as the data set. Firstly, this paper classifies and adds labels to the original comment dataset obtained by the web crawler, and then uses Jieba word segmentation to classify the original dataset and remove stop words. After that, this paper extracts text feature vectors and builds document word vectors to facilitate the training of the model. Finally, SVM and LSTM models are trained respectively. After accuracy calculation, it can be obtained that the accuracy of the LSTM model is 85.80%, while the accuracy of SVM is 91.07%. But at the same time, LSTM operation only needs 2.57 seconds, SVM model needs 6.06 seconds. Therefore, this paper concludes that: compared with the SVM model, the LSTM model is worse in accuracy but faster in processing speed. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=sentiment%20analysis" title="sentiment analysis">sentiment analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=support%20vector%20machine" title=" support vector machine"> support vector machine</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory" title=" long short-term memory"> long short-term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=Chinese%20microblog%20comments" title=" Chinese microblog comments"> Chinese microblog comments</a> </p> <a href="https://publications.waset.org/abstracts/154733/sentiment-analysis-of-chinese-microblog-comments-comparison-between-support-vector-machine-and-long-short-term-memory" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/154733.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">94</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7004</span> Bidirectional Long Short-Term Memory-Based Signal Detection for Orthogonal Frequency Division Multiplexing With All Index Modulation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mahmut%20Yildirim">Mahmut Yildirim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper proposed the bidirectional long short-term memory (Bi-LSTM) network-aided deep learning (DL)-based signal detection for Orthogonal frequency division multiplexing with all index modulation (OFDM-AIM), namely Bi-DeepAIM. OFDM-AIM is developed to increase the spectral efficiency of OFDM with index modulation (OFDM-IM), a promising multi-carrier technique for communication systems beyond 5G. In this paper, due to its strong classification ability, Bi-LSTM is considered an alternative to the maximum likelihood (ML) algorithm, which is used for signal detection in the classical OFDM-AIM scheme. The performance of the Bi-DeepAIM is compared with LSTM network-aided DL-based OFDM-AIM (DeepAIM) and classic OFDM-AIM that uses (ML)-based signal detection via BER performance and computational time criteria. Simulation results show that Bi-DeepAIM obtains better bit error rate (BER) performance than DeepAIM and lower computation time in signal detection than ML-AIM. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=bidirectional%20long%20short-term%20memory" title="bidirectional long short-term memory">bidirectional long short-term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=maximum%20likelihood" title=" maximum likelihood"> maximum likelihood</a>, <a href="https://publications.waset.org/abstracts/search?q=OFDM%20with%20all%20index%20modulation" title=" OFDM with all index modulation"> OFDM with all index modulation</a>, <a href="https://publications.waset.org/abstracts/search?q=signal%20detection" title=" signal detection"> signal detection</a> </p> <a href="https://publications.waset.org/abstracts/183512/bidirectional-long-short-term-memory-based-signal-detection-for-orthogonal-frequency-division-multiplexing-with-all-index-modulation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/183512.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">72</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7003</span> Electrical Machine Winding Temperature Estimation Using Stateful Long Short-Term Memory Networks (LSTM) and Truncated Backpropagation Through Time (TBPTT)</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yujiang%20Wu">Yujiang Wu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> As electrical machine (e-machine) power density re-querulents become more stringent in vehicle electrification, mounting a temperature sensor for e-machine stator windings becomes increasingly difficult. This can lead to higher manufacturing costs, complicated harnesses, and reduced reliability. In this paper, we propose a deep-learning method for predicting electric machine winding temperature, which can either replace the sensor entirely or serve as a backup to the existing sensor. We compare the performance of our method, the stateful long short-term memory networks (LSTM) with truncated backpropagation through time (TBTT), with that of linear regression, as well as stateless LSTM with/without residual connection. Our results demonstrate the strength of combining stateful LSTM and TBTT in tackling nonlinear time series prediction problems with long sequence lengths. Additionally, in industrial applications, high-temperature region prediction accuracy is more important because winding temperature sensing is typically used for derating machine power when the temperature is high. To evaluate the performance of our algorithm, we developed a temperature-stratified MSE. We propose a simple but effective data preprocessing trick to improve the high-temperature region prediction accuracy. Our experimental results demonstrate the effectiveness of our proposed method in accurately predicting winding temperature, particularly in high-temperature regions, while also reducing manufacturing costs and improving reliability. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title="deep learning">deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=electrical%20machine" title=" electrical machine"> electrical machine</a>, <a href="https://publications.waset.org/abstracts/search?q=functional%20safety" title=" functional safety"> functional safety</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20networks%20%28LSTM%29" title=" long short-term memory networks (LSTM)"> long short-term memory networks (LSTM)</a>, <a href="https://publications.waset.org/abstracts/search?q=thermal%20management" title=" thermal management"> thermal management</a>, <a href="https://publications.waset.org/abstracts/search?q=time%20series%20prediction" title=" time series prediction"> time series prediction</a> </p> <a href="https://publications.waset.org/abstracts/170991/electrical-machine-winding-temperature-estimation-using-stateful-long-short-term-memory-networks-lstm-and-truncated-backpropagation-through-time-tbptt" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/170991.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">99</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7002</span> Power Grid Line Ampacity Forecasting Based on a Long-Short-Term Memory Neural Network </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Xiang-Yao%20Zheng">Xiang-Yao Zheng</a>, <a href="https://publications.waset.org/abstracts/search?q=Jen-Cheng%20Wang"> Jen-Cheng Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Joe-Air%20Jiang"> Joe-Air Jiang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Improving the line ampacity while using existing power grids is an important issue that electricity dispatchers are now facing. Using the information provided by the dynamic thermal rating (DTR) of transmission lines, an overhead power grid can operate safely. However, dispatchers usually lack real-time DTR information. Thus, this study proposes a long-short-term memory (LSTM)-based method, which is one of the neural network models. The LSTM-based method predicts the DTR of lines using the weather data provided by Central Weather Bureau (CWB) of Taiwan. The possible thermal bottlenecks at different locations along the line and the margin of line ampacity can be real-time determined by the proposed LSTM-based prediction method. A case study that targets the 345 kV power grid of TaiPower in Taiwan is utilized to examine the performance of the proposed method. The simulation results show that the proposed method is useful to provide the information for the smart grid application in the future. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=electricity%20dispatch" title="electricity dispatch">electricity dispatch</a>, <a href="https://publications.waset.org/abstracts/search?q=line%20ampacity%20prediction" title=" line ampacity prediction"> line ampacity prediction</a>, <a href="https://publications.waset.org/abstracts/search?q=dynamic%20thermal%20rating" title=" dynamic thermal rating"> dynamic thermal rating</a>, <a href="https://publications.waset.org/abstracts/search?q=long-short-term%20memory%20neural%20network" title=" long-short-term memory neural network"> long-short-term memory neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=smart%20grid" title=" smart grid"> smart grid</a> </p> <a href="https://publications.waset.org/abstracts/63755/power-grid-line-ampacity-forecasting-based-on-a-long-short-term-memory-neural-network" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/63755.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">282</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7001</span> Performance Evaluation of the Classic seq2seq Model versus a Proposed Semi-supervised Long Short-Term Memory Autoencoder for Time Series Data Forecasting</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Aswathi%20Thrivikraman">Aswathi Thrivikraman</a>, <a href="https://publications.waset.org/abstracts/search?q=S.%20Advaith"> S. Advaith</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The study is aimed at designing encoders for deciphering intricacies in time series data by redescribing the dynamics operating on a lower-dimensional manifold. A semi-supervised LSTM autoencoder is devised and investigated to see if the latent representation of the time series data can better forecast the data. End-to-end training of the LSTM autoencoder, together with another LSTM network that is connected to the latent space, forces the hidden states of the encoder to represent the most meaningful latent variables relevant for forecasting. Furthermore, the study compares the predictions with those of a traditional seq2seq model. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=LSTM" title="LSTM">LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=autoencoder" title=" autoencoder"> autoencoder</a>, <a href="https://publications.waset.org/abstracts/search?q=forecasting" title=" forecasting"> forecasting</a>, <a href="https://publications.waset.org/abstracts/search?q=seq2seq%20model" title=" seq2seq model"> seq2seq model</a> </p> <a href="https://publications.waset.org/abstracts/157449/performance-evaluation-of-the-classic-seq2seq-model-versus-a-proposed-semi-supervised-long-short-term-memory-autoencoder-for-time-series-data-forecasting" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/157449.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">155</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">7000</span> Artificial Neural Network for Forecasting of Daily Reservoir Inflow: Case Study of the Kotmale Reservoir in Sri Lanka</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=E.%20U.%20Dampage">E. U. Dampage</a>, <a href="https://publications.waset.org/abstracts/search?q=Ovindi%20D.%20Bandara"> Ovindi D. Bandara</a>, <a href="https://publications.waset.org/abstracts/search?q=Vinushi%20S.%20Waraketiya"> Vinushi S. Waraketiya</a>, <a href="https://publications.waset.org/abstracts/search?q=Samitha%20S.%20R.%20De%20Silva"> Samitha S. R. De Silva</a>, <a href="https://publications.waset.org/abstracts/search?q=Yasiru%20S.%20Gunarathne"> Yasiru S. Gunarathne</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The knowledge of water inflow figures is paramount in decision making on the allocation for consumption for numerous purposes; irrigation, hydropower, domestic and industrial usage, and flood control. The understanding of how reservoir inflows are affected by different climatic and hydrological conditions is crucial to enable effective water management and downstream flood control. In this research, we propose a method using a Long Short Term Memory (LSTM) Artificial Neural Network (ANN) to assist the aforesaid decision-making process. The Kotmale reservoir, which is the uppermost reservoir in the Mahaweli reservoir complex in Sri Lanka, was used as the test bed for this research. The ANN uses the runoff in the Kotmale reservoir catchment area and the effect of Sea Surface Temperatures (SST) to make a forecast for seven days ahead. Three types of ANN are tested; Multi-Layer Perceptron (MLP), Convolutional Neural Network (CNN), and LSTM. The extensive field trials and validation endeavors found that the LSTM ANN provides superior performance in the aspects of accuracy and latency. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20network" title="convolutional neural network">convolutional neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=CNN" title=" CNN"> CNN</a>, <a href="https://publications.waset.org/abstracts/search?q=inflow" title=" inflow"> inflow</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory" title=" long short-term memory"> long short-term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=LSTM" title=" LSTM"> LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-layer%20perceptron" title=" multi-layer perceptron"> multi-layer perceptron</a>, <a href="https://publications.waset.org/abstracts/search?q=MLP" title=" MLP"> MLP</a>, <a href="https://publications.waset.org/abstracts/search?q=neural%20network" title=" neural network"> neural network</a> </p> <a href="https://publications.waset.org/abstracts/126767/artificial-neural-network-for-forecasting-of-daily-reservoir-inflow-case-study-of-the-kotmale-reservoir-in-sri-lanka" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/126767.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">151</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6999</span> Forecasting the Temperature at a Weather Station Using Deep Neural Networks</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Debneil%20Saha%20Roy">Debneil Saha Roy</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Weather forecasting is a complex topic and is well suited for analysis by deep learning approaches. With the wide availability of weather observation data nowadays, these approaches can be utilized to identify immediate comparisons between historical weather forecasts and current observations. This work explores the application of deep learning techniques to weather forecasting in order to accurately predict the weather over a given forecast horizon. Three deep neural networks are used in this study, namely, Multi-Layer Perceptron (MLP), Long Short Tunn Memory Network (LSTM) and a combination of Convolutional Neural Network (CNN) and LSTM. The predictive performance of these models is compared using two evaluation metrics. The results show that forecasting accuracy increases with an increase in the complexity of deep neural networks. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20network" title="convolutional neural network">convolutional neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short%20term%20memory" title=" long short term memory"> long short term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-layer%20perceptron" title=" multi-layer perceptron"> multi-layer perceptron</a> </p> <a href="https://publications.waset.org/abstracts/124787/forecasting-the-temperature-at-a-weather-station-using-deep-neural-networks" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/124787.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">177</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6998</span> Ground Surface Temperature History Prediction Using Long-Short Term Memory Neural Network Architecture</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Venkat%20S.%20Somayajula">Venkat S. Somayajula</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Ground surface temperature history prediction model plays a vital role in determining standards for international nuclear waste management. International standards for borehole based nuclear waste disposal require paleoclimate cycle predictions on scale of a million forward years for the place of waste disposal. This research focuses on developing a paleoclimate cycle prediction model using Bayesian long-short term memory (LSTM) neural architecture operated on accumulated borehole temperature history data. Bayesian models have been previously used for paleoclimate cycle prediction based on Monte-Carlo weight method, but due to limitations pertaining model coupling with certain other prediction networks, Bayesian models in past couldn’t accommodate prediction cycle’s over 1000 years. LSTM has provided frontier to couple developed models with other prediction networks with ease. Paleoclimate cycle developed using this process will be trained on existing borehole data and then will be coupled to surface temperature history prediction networks which give endpoints for backpropagation of LSTM network and optimize the cycle of prediction for larger prediction time scales. Trained LSTM will be tested on past data for validation and then propagated for forward prediction of temperatures at borehole locations. 
This research will benefit studies pertaining to nuclear waste management, anthropological cycle prediction, and geophysical features.
Keywords: Bayesian long-short term memory neural network, borehole temperature, ground surface temperature history, paleoclimate cycle
Procedia PDF: https://publications.waset.org/abstracts/124063.pdf (Downloads: 128)

6997. Memory Based Reinforcement Learning with Transformers for Long Horizon Timescales and Continuous Action Spaces
Authors: Shweta Singh, Sudaman Katti
Abstract: The most well-known sequence models make use of complex recurrent neural networks in an encoder-decoder configuration. The model used in this research instead makes use of a transformer, which is based purely on a self-attention mechanism, without relying on recurrence at all. More specifically, encoders and decoders that use self-attention and operate on a memory are employed. In this research work, results were obtained for various 3D visual and non-visual reinforcement learning tasks designed in the Unity software. Convolutional neural networks, specifically the Nature CNN architecture, are used for input processing in the visual tasks, and comparison with a standard long short-term memory (LSTM) architecture is performed for both the visual tasks based on CNNs and the non-visual tasks based on coordinate inputs. This work combines the transformer architecture with the proximal policy optimization technique, used widely in reinforcement learning for stability and better policy updates during training, especially for the continuous action spaces considered here. Certain tasks in this paper are long-horizon tasks that carry on for a longer duration and require extensive use of memory-based functionality, such as storing experiences and choosing appropriate actions based on recall. The transformer, which uses memory and a self-attention mechanism in an encoder-decoder configuration, proved to perform better than the LSTM in terms of exploration and rewards achieved. Such memory-based architectures can be used extensively in the fields of cognitive robotics and reinforcement learning.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20networks" title="convolutional neural networks">convolutional neural networks</a>, <a href="https://publications.waset.org/abstracts/search?q=reinforcement%20learning" title=" reinforcement learning"> reinforcement learning</a>, <a href="https://publications.waset.org/abstracts/search?q=self-attention" title=" self-attention"> self-attention</a>, <a href="https://publications.waset.org/abstracts/search?q=transformers" title=" transformers"> transformers</a>, <a href="https://publications.waset.org/abstracts/search?q=unity" title=" unity"> unity</a> </p> <a href="https://publications.waset.org/abstracts/163301/memory-based-reinforcement-learning-with-transformers-for-long-horizon-timescales-and-continuous-action-spaces" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/163301.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">136</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6996</span> Image Captioning with Vision-Language Models</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Promise%20Ekpo%20Osaine">Promise Ekpo Osaine</a>, <a href="https://publications.waset.org/abstracts/search?q=Daniel%20Melesse"> Daniel Melesse</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Image captioning is an active area of research in the multi-modal artificial intelligence (AI) community as it connects vision and language understanding, especially in settings where it is required that a model understands the content shown in an image and generates semantically and grammatically correct descriptions. In this project, we followed a standard approach to a deep learning-based image captioning model, injecting architecture for the encoder-decoder setup, where the encoder extracts image features, and the decoder generates a sequence of words that represents the image content. As such, we investigated image encoders, which are ResNet101, InceptionResNetV2, EfficientNetB7, EfficientNetV2M, and CLIP. As a caption generation structure, we explored long short-term memory (LSTM). The CLIP-LSTM model demonstrated superior performance compared to the encoder-decoder models, achieving a BLEU-1 score of 0.904 and a BLEU-4 score of 0.640. Additionally, among the CNN-LSTM models, EfficientNetV2M-LSTM exhibited the highest performance with a BLEU-1 score of 0.896 and a BLEU-4 score of 0.586 while using a single-layer LSTM. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=multi-modal%20AI%20systems" title="multi-modal AI systems">multi-modal AI systems</a>, <a href="https://publications.waset.org/abstracts/search?q=image%20captioning" title=" image captioning"> image captioning</a>, <a href="https://publications.waset.org/abstracts/search?q=encoder" title=" encoder"> encoder</a>, <a href="https://publications.waset.org/abstracts/search?q=decoder" title=" decoder"> decoder</a>, <a href="https://publications.waset.org/abstracts/search?q=BLUE%20score" title=" BLUE score"> BLUE score</a> </p> <a href="https://publications.waset.org/abstracts/181849/image-captioning-with-vision-language-models" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/181849.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">77</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6995</span> Deep Learning-Based Channel Estimation for RIS-Assisted Unmanned Aerial Vehicle-Enabled Wireless Communication System</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Getaneh%20Berie%20Tarekegn">Getaneh Berie Tarekegn</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Wireless communication via unmanned aerial vehicles (UAVs) has drawn a great deal of attention due to its flexibility in establishing line-of-sight (LoS) communications. However, in complex urban and dynamic environments, the movement of UAVs can be blocked by trees and high-rise buildings that obstruct directional paths. With reconfigurable intelligent surfaces (RIS), this problem can be effectively addressed. To achieve this goal, accurate channel estimation in RIS-assisted UAV-enabled wireless communications is crucial. This paper proposes an accurate channel estimation model using long short-term memory (LSTM) for a multi-user RIS-assisted UAV-enabled wireless communication system. According to simulation results, LSTM can improve the channel estimation performance of RIS-assisted UAV-enabled wireless communication. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=channel%20estimation" title="channel estimation">channel estimation</a>, <a href="https://publications.waset.org/abstracts/search?q=reconfigurable%20intelligent%20surfaces" title=" reconfigurable intelligent surfaces"> reconfigurable intelligent surfaces</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory" title=" long short-term memory"> long short-term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicles" title=" unmanned aerial vehicles"> unmanned aerial vehicles</a> </p> <a href="https://publications.waset.org/abstracts/184507/deep-learning-based-channel-estimation-for-ris-assisted-unmanned-aerial-vehicle-enabled-wireless-communication-system" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/184507.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">57</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6994</span> Generating Swarm Satellite Data Using Long Short-Term Memory and Generative Adversarial Networks for the Detection of Seismic Precursors</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yaxin%20Bi">Yaxin Bi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in the fields of geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), a generative model tailored to time-series data, for generating synthetic time series data based on Swarm satellite data, which will be used for detecting seismic anomalies. LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled to generate synthetic data, often producing non-informative values, although they were able to capture the data distribution of the time series. These findings highlight both the promise and challenges associated with applying deep learning techniques to generate synthetic data, underscoring the potential of deep learning in generating synthetic electromagnetic satellite data. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=LSTM" title="LSTM">LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=GAN" title=" GAN"> GAN</a>, <a href="https://publications.waset.org/abstracts/search?q=earthquake" title=" earthquake"> earthquake</a>, <a href="https://publications.waset.org/abstracts/search?q=synthetic%20data" title=" synthetic data"> synthetic data</a>, <a href="https://publications.waset.org/abstracts/search?q=generative%20AI" title=" generative AI"> generative AI</a>, <a href="https://publications.waset.org/abstracts/search?q=seismic%20precursors" title=" seismic precursors"> seismic precursors</a> </p> <a href="https://publications.waset.org/abstracts/187478/generating-swarm-satellite-data-using-long-short-term-memory-and-generative-adversarial-networks-for-the-detection-of-seismic-precursors" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/187478.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">32</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6993</span> Deep Learning-Based Channel Estimation for Reconfigurable Intelligent Surface-Assisted Unmanned Aerial Vehicle-Enabled Wireless Communication System</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Getaneh%20Berie%20Tarekegn">Getaneh Berie Tarekegn</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Wireless communication via unmanned aerial vehicles (UAVs) has drawn a great deal of attention due to its flexibility in establishing line-of-sight (LoS) communications. However, in complex urban and dynamic environments, the movement of UAVs can be blocked by trees and high-rise buildings that obstruct directional paths. With reconfigurable intelligent surfaces (RIS), this problem can be effectively addressed. To achieve this goal, accurate channel estimation in RIS-assisted UAV-enabled wireless communications is crucial. This paper proposes an accurate channel estimation model using long short-term memory (LSTM) for a multi-user RIS-assisted UAV-enabled wireless communication system. According to simulation results, LSTM can improve the channel estimation performance of RIS-assisted UAV-enabled wireless communication. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=channel%20estimation" title="channel estimation">channel estimation</a>, <a href="https://publications.waset.org/abstracts/search?q=reconfigurable%20intelligent%20surfaces" title=" reconfigurable intelligent surfaces"> reconfigurable intelligent surfaces</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory" title=" long short-term memory"> long short-term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=unmanned%20aerial%20vehicles" title=" unmanned aerial vehicles"> unmanned aerial vehicles</a> </p> <a href="https://publications.waset.org/abstracts/174128/deep-learning-based-channel-estimation-for-reconfigurable-intelligent-surface-assisted-unmanned-aerial-vehicle-enabled-wireless-communication-system" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/174128.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">110</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6992</span> Deep Learning for Renewable Power Forecasting: An Approach Using LSTM Neural Networks</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Faz%C4%B1l%20G%C3%B6kg%C3%B6z">Fazıl Gökgöz</a>, <a href="https://publications.waset.org/abstracts/search?q=Fahrettin%20Filiz"> Fahrettin Filiz</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Load forecasting has become crucial in recent years and become popular in forecasting area. Many different power forecasting models have been tried out for this purpose. Electricity load forecasting is necessary for energy policies, healthy and reliable grid systems. Effective power forecasting of renewable energy load leads the decision makers to minimize the costs of electric utilities and power plants. Forecasting tools are required that can be used to predict how much renewable energy can be utilized. The purpose of this study is to explore the effectiveness of LSTM-based neural networks for estimating renewable energy loads. In this study, we present models for predicting renewable energy loads based on deep neural networks, especially the Long Term Memory (LSTM) algorithms. Deep learning allows multiple layers of models to learn representation of data. LSTM algorithms are able to store information for long periods of time. Deep learning models have recently been used to forecast the renewable energy sources such as predicting wind and solar energy power. Historical load and weather information represent the most important variables for the inputs within the power forecasting models. The dataset contained power consumption measurements are gathered between January 2016 and December 2017 with one-hour resolution. Models use publicly available data from the Turkish Renewable Energy Resources Support Mechanism. Forecasting studies have been carried out with these data via deep neural networks approach including LSTM technique for Turkish electricity markets. 432 different models are created by changing layers cell count and dropout. The adaptive moment estimation (ADAM) algorithm is used for training as a gradient-based optimizer instead of SGD (stochastic gradient). 
ADAM performed better than SGD in terms of faster convergence and lower error rates. Model performance is compared using MAE (Mean Absolute Error) and MSE (Mean Squared Error). The best MAE results among the 432 tested models are 0.66, 0.74, 0.85, and 1.09. The forecasting performance of the proposed LSTM models compares favourably with results reported in the literature. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title="deep learning">deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short%20term%20memory" title=" long short term memory"> long short term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=energy" title=" energy"> energy</a>, <a href="https://publications.waset.org/abstracts/search?q=renewable%20energy%20load%20forecasting" title=" renewable energy load forecasting"> renewable energy load forecasting</a> </p> <a href="https://publications.waset.org/abstracts/91058/deep-learning-for-renewable-power-forecasting-an-approach-using-lstm-neural-networks" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/91058.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">266</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6991</span> A Long Short-Term Memory Based Deep Learning Model for Corporate Bond Price Predictions</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Vikrant%20Gupta">Vikrant Gupta</a>, <a href="https://publications.waset.org/abstracts/search?q=Amrit%20Goswami"> Amrit Goswami</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The fixed income market forms the basis of the modern financial market. All other assets in financial markets derive their value from the bond market. Owing to their over-the-counter nature, corporate bonds have relatively little data publicly available and are therefore researched far less than equities. Bond price prediction is a complex financial time series forecasting problem and is considered very important in the domain of finance. Bond prices are highly volatile and noisy, which makes it very difficult for traditional statistical time-series models to capture the complexity of the series patterns, leading to inefficient forecasts. To overcome the inefficiencies of statistical models, various machine learning techniques were initially used in the literature for more accurate forecasting of time series. However, simple machine learning methods such as linear regression, support vector machines, and random forests fail to provide efficient results when tested on highly complex sequences such as stock prices and bond prices. Hence, to capture these intricate sequence patterns, various deep learning-based methodologies have been discussed in the literature. In this study, a recurrent neural network-based deep learning model using long short-term memory networks for the prediction of corporate bond prices is discussed. Long Short-Term Memory (LSTM) networks have been widely used in the literature for sequence learning tasks in various domains such as machine translation, speech recognition, etc. 
In recent years, various studies have discussed the effectiveness of LSTMs in forecasting complex time-series sequences and have shown promising results compared to other methodologies. LSTMs are a special kind of recurrent neural network capable of learning long-term dependencies, thanks to a memory mechanism that traditional neural networks lack. In this study, a simple LSTM, a stacked LSTM, and a masked LSTM based model are discussed with respect to varying input sequence lengths (three days, seven days, and 14 days). In order to facilitate faster learning and to gradually decompose the complexity of the bond price sequence, Empirical Mode Decomposition (EMD) has been used, which has improved the accuracy of the standalone LSTM model. With a variety of technical indicators and the EMD-decomposed time series, the masked LSTM outperformed its two counterparts in terms of prediction accuracy. To benchmark the proposed model, the results have been compared with traditional time series models (ARIMA), shallow neural networks, and the three LSTM models discussed above. In summary, our results show that the use of LSTM models provides more accurate results and should be explored further within the asset management industry. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=bond%20prices" title="bond prices">bond prices</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory" title=" long short-term memory"> long short-term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=time%20series%20forecasting" title=" time series forecasting"> time series forecasting</a>, <a href="https://publications.waset.org/abstracts/search?q=empirical%20mode%20decomposition" title=" empirical mode decomposition "> empirical mode decomposition </a> </p> <a href="https://publications.waset.org/abstracts/129312/a-long-short-term-memory-based-deep-learning-model-for-corporate-bond-price-predictions" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/129312.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">136</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6990</span> Long Short-Term Memory Based Model for Modeling Nicotine Consumption Using an Electronic Cigarette and Internet of Things Devices</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hamdi%20Amroun">Hamdi Amroun</a>, <a href="https://publications.waset.org/abstracts/search?q=Yacine%20Benziani"> Yacine Benziani</a>, <a href="https://publications.waset.org/abstracts/search?q=Mehdi%20Ammi"> Mehdi Ammi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, we want to determine whether an accurate prediction of nicotine concentration can be obtained by using a network of smart objects and an e-cigarette. The approach consists of, first, recognizing factors influencing smoking cessation, such as physical activity and participants' behaviors (using both a smartphone and a smartwatch), and, second, predicting the configuration of the e-cigarette (in terms of nicotine concentration, power, and resistance). 
The study uses a network of commonly connected objects: a smartwatch, a smartphone, and an e-cigarette carried by the participants during an uncontrolled experiment. The data obtained from the sensors in the three devices were used to train a long short-term memory (LSTM) model. Results show that our LSTM-based model can predict the configuration of the e-cigarette in terms of nicotine concentration, power, and resistance with root mean square error percentages of 12.9%, 9.15%, and 11.84%, respectively. This study can help to better control nicotine consumption and offer an intelligent e-cigarette configuration to users. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Iot" title="Iot">Iot</a>, <a href="https://publications.waset.org/abstracts/search?q=activity%20recognition" title=" activity recognition"> activity recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=automatic%20classification" title=" automatic classification"> automatic classification</a>, <a href="https://publications.waset.org/abstracts/search?q=unconstrained%20environment" title=" unconstrained environment"> unconstrained environment</a> </p> <a href="https://publications.waset.org/abstracts/89965/long-short-term-memory-based-model-for-modeling-nicotine-consumption-using-an-electronic-cigarette-and-internet-of-things-devices" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/89965.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">224</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6989</span> Preparation on Sentimental Analysis on Social Media Comments with Bidirectional Long Short-Term Memory Gated Recurrent Unit and Model Glove in Portuguese</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Leonardo%20Alfredo%20Mendoza">Leonardo Alfredo Mendoza</a>, <a href="https://publications.waset.org/abstracts/search?q=Cristian%20Munoz"> Cristian Munoz</a>, <a href="https://publications.waset.org/abstracts/search?q=Marco%20Aurelio%20Pacheco"> Marco Aurelio Pacheco</a>, <a href="https://publications.waset.org/abstracts/search?q=Manoela%20Kohler"> Manoela Kohler</a>, <a href="https://publications.waset.org/abstracts/search?q=Evelyn%20%20Batista"> Evelyn Batista</a>, <a href="https://publications.waset.org/abstracts/search?q=Rodrigo%20Moura"> Rodrigo Moura</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Natural Language Processing (NLP) techniques are increasingly able to interpret people's feelings about and reactions to a product or service. Sentiment analysis has become a fundamental tool for this interpretation, but it has few applications in languages other than English. This paper presents a sentiment classification for Portuguese based on a corpus of social network comments in Portuguese. A word embedding representation was used with a 50-dimension GloVe pre-trained model, generated from a corpus entirely in Portuguese. To generate this classification, bidirectional long short-term memory (BiLSTM) and bidirectional Gated Recurrent Unit (GRU) models are used, reaching results of 99.1%. 
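<p class="card-text"><em>Illustrative sketch:</em> a minimal BiLSTM + BiGRU text classifier of the kind the abstract describes; the vocabulary size, sequence length, and layer widths are assumptions, and in practice the embedding layer would be initialised from the 50-dimension Portuguese GloVe matrix rather than trained from scratch.</p> <pre><code class="language-python">
# Minimal sketch of a bidirectional LSTM + bidirectional GRU sentiment classifier.
# Vocabulary size, sequence length and layer widths are illustrative assumptions;
# the Embedding layer stands in for one loaded with 50-d GloVe vectors.
import tensorflow as tf

vocab_size, embed_dim, max_len = 20000, 50, 100

model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_len,)),
    tf.keras.layers.Embedding(vocab_size, embed_dim),  # would hold GloVe weights
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64, return_sequences=True)),
    tf.keras.layers.Bidirectional(tf.keras.layers.GRU(32)),
    tf.keras.layers.Dense(1, activation="sigmoid"),    # positive vs. negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
</code></pre>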
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=natural%20processing%20language" title="natural processing language">natural processing language</a>, <a href="https://publications.waset.org/abstracts/search?q=sentiment%20analysis" title=" sentiment analysis"> sentiment analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=bidirectional%20long%20short-term%20memory" title=" bidirectional long short-term memory"> bidirectional long short-term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=BI-LSTM" title=" BI-LSTM"> BI-LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=gated%20recurrent%20unit" title=" gated recurrent unit"> gated recurrent unit</a>, <a href="https://publications.waset.org/abstracts/search?q=GRU" title=" GRU"> GRU</a> </p> <a href="https://publications.waset.org/abstracts/131061/preparation-on-sentimental-analysis-on-social-media-comments-with-bidirectional-long-short-term-memory-gated-recurrent-unit-and-model-glove-in-portuguese" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/131061.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">159</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6988</span> Efficient Fake News Detection Using Machine Learning and Deep Learning Approaches</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Chaima%20Babi">Chaima Babi</a>, <a href="https://publications.waset.org/abstracts/search?q=Said%20Gadri"> Said Gadri</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The rapid increase in fake news continues to grow at a very fast rate; this requires implementing efficient techniques that allow testing the re-liability of online content. For that, the current research strives to illuminate the fake news problem using deep learning DL and machine learning ML ap-proaches. We have developed the traditional LSTM (Long short-term memory), and the bidirectional BiLSTM model. A such process is to perform a training task on almost of samples of the dataset, validate the model on a subset called the test set to provide an unbiased evaluation of the final model fit on the training dataset, then compute the accuracy of detecting classifica-tion and comparing the results. For the programming stage, we used Tensor-Flow and Keras libraries on Python to support Graphical Processing Units (GPUs) that are being used for developing deep learning applications. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title="machine learning">machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=natural%20language" title=" natural language"> natural language</a>, <a href="https://publications.waset.org/abstracts/search?q=fake%20news" title=" fake news"> fake news</a>, <a href="https://publications.waset.org/abstracts/search?q=Bi-LSTM" title=" Bi-LSTM"> Bi-LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=LSTM" title=" LSTM"> LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=multiclass%20classification" title=" multiclass classification"> multiclass classification</a> </p> <a href="https://publications.waset.org/abstracts/176098/efficient-fake-news-detection-using-machine-learning-and-deep-learning-approaches" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/176098.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">95</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6987</span> A Conv-Long Short-term Memory Deep Learning Model for Traffic Flow Prediction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ali%20Reza%20Sattarzadeh">Ali Reza Sattarzadeh</a>, <a href="https://publications.waset.org/abstracts/search?q=Ronny%20J.%20Kutadinata"> Ronny J. Kutadinata</a>, <a href="https://publications.waset.org/abstracts/search?q=Pubudu%20N.%20Pathirana"> Pubudu N. Pathirana</a>, <a href="https://publications.waset.org/abstracts/search?q=Van%20Thanh%20Huynh"> Van Thanh Huynh</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Traffic congestion has become a severe worldwide problem, affecting everyday life, fuel consumption, time, and air pollution. The primary causes of these issues are inadequate transportation infrastructure, poor traffic signal management, and rising population. Traffic flow forecasting is one of the essential and effective methods in urban congestion and traffic management, which has attracted the attention of researchers. With the development of technology, undeniable progress has been achieved in existing methods. However, there is a possibility of improvement in the extraction of temporal and spatial features to determine the importance of traffic flow sequences and extraction features. In the proposed model, we implement the convolutional neural network (CNN) and long short-term memory (LSTM) deep learning models for mining nonlinear correlations and their effectiveness in increasing the accuracy of traffic flow prediction in the real dataset. According to the experiments, the results indicate that implementing Conv-LSTM networks increases the productivity and accuracy of deep learning models for traffic flow prediction. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=deep%20learning%20algorithms" title="deep learning algorithms">deep learning algorithms</a>, <a href="https://publications.waset.org/abstracts/search?q=intelligent%20transportation%20systems" title=" intelligent transportation systems"> intelligent transportation systems</a>, <a href="https://publications.waset.org/abstracts/search?q=spatiotemporal%20features" title=" spatiotemporal features"> spatiotemporal features</a>, <a href="https://publications.waset.org/abstracts/search?q=traffic%20flow%20prediction" title=" traffic flow prediction"> traffic flow prediction</a> </p> <a href="https://publications.waset.org/abstracts/153966/a-conv-long-short-term-memory-deep-learning-model-for-traffic-flow-prediction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/153966.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">171</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6986</span> Synthetic Data-Driven Prediction Using GANs and LSTMs for Smart Traffic Management</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Srinivas%20Peri">Srinivas Peri</a>, <a href="https://publications.waset.org/abstracts/search?q=Siva%20Abhishek%20Sirivella"> Siva Abhishek Sirivella</a>, <a href="https://publications.waset.org/abstracts/search?q=Tejaswini%20Kallakuri"> Tejaswini Kallakuri</a>, <a href="https://publications.waset.org/abstracts/search?q=Uzair%20Ahmad"> Uzair Ahmad</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Smart cities and intelligent transportation systems rely heavily on effective traffic management and infrastructure planning. This research tackles the data scarcity challenge by generating realistically synthetic traffic data from the PeMS-Bay dataset, enhancing predictive modeling accuracy and reliability. Advanced techniques like TimeGAN and GaussianCopula are utilized to create synthetic data that mimics the statistical and structural characteristics of real-world traffic. The future integration of Spatial-Temporal Generative Adversarial Networks (ST-GAN) is anticipated to capture both spatial and temporal correlations, further improving data quality and realism. Each synthetic data generation model's performance is evaluated against real-world data to identify the most effective models for accurately replicating traffic patterns. Long Short-Term Memory (LSTM) networks are employed to model and predict complex temporal dependencies within traffic patterns. This holistic approach aims to identify areas with low vehicle counts, reveal underlying traffic issues, and guide targeted infrastructure interventions. By combining GAN-based synthetic data generation with LSTM-based traffic modeling, this study facilitates data-driven decision-making that improves urban mobility, safety, and the overall efficiency of city planning initiatives. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=GAN" title="GAN">GAN</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29" title=" long short-term memory (LSTM)"> long short-term memory (LSTM)</a>, <a href="https://publications.waset.org/abstracts/search?q=synthetic%20%20data%20generation" title=" synthetic data generation"> synthetic data generation</a>, <a href="https://publications.waset.org/abstracts/search?q=traffic%20management" title=" traffic management"> traffic management</a> </p> <a href="https://publications.waset.org/abstracts/192173/synthetic-data-driven-prediction-using-gans-and-lstms-for-smart-traffic-management" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/192173.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">14</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6985</span> Forecasting Nokoué Lake Water Levels Using Long Short-Term Memory Network</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Namwinwelbere%20Dabire">Namwinwelbere Dabire</a>, <a href="https://publications.waset.org/abstracts/search?q=Eugene%20C.%20Ezin"> Eugene C. Ezin</a>, <a href="https://publications.waset.org/abstracts/search?q=Adandedji%20M.%20Firmin"> Adandedji M. Firmin</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The prediction of hydrological flows (rainfall-depth or rainfall-discharge) is becoming increasingly important in the management of hydrological risks such as floods. In this study, the Long Short-Term Memory (LSTM) network, a state-of-the-art algorithm dedicated to time series, is applied to predict the daily water level of Nokoue Lake in Benin. This paper aims to provide an effective and reliable method enable of reproducing the future daily water level of Nokoue Lake, which is influenced by a combination of two phenomena: rainfall and river flow (runoff from the Ouémé River, the Sô River, the Porto-Novo lagoon, and the Atlantic Ocean). Performance analysis based on the forecasting horizon indicates that LSTM can predict the water level of Nokoué Lake up to a forecast horizon of t+10 days. Performance metrics such as Root Mean Square Error (RMSE), coefficient of correlation (R²), Nash-Sutcliffe Efficiency (NSE), and Mean Absolute Error (MAE) agree on a forecast horizon of up to t+3 days. The values of these metrics remain stable for forecast horizons of t+1 days, t+2 days, and t+3 days. The values of R² and NSE are greater than 0.97 during the training and testing phases in the Nokoué Lake basin. Based on the evaluation indices used to assess the model's performance for the appropriate forecast horizon of water level in the Nokoué Lake basin, the forecast horizon of t+3 days is chosen for predicting future daily water levels. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=forecasting" title="forecasting">forecasting</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20cell" title=" long short-term memory cell"> long short-term memory cell</a>, <a href="https://publications.waset.org/abstracts/search?q=recurrent%20artificial%20neural%20network" title=" recurrent artificial neural network"> recurrent artificial neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=Nokou%C3%A9%20lake" title=" Nokoué lake"> Nokoué lake</a> </p> <a href="https://publications.waset.org/abstracts/176864/forecasting-nokoue-lake-water-levels-using-long-short-term-memory-network" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/176864.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">64</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6984</span> Statistically Accurate Synthetic Data Generation for Enhanced Traffic Predictive Modeling Using Generative Adversarial Networks and Long Short-Term Memory</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Srinivas%20Peri">Srinivas Peri</a>, <a href="https://publications.waset.org/abstracts/search?q=Siva%20Abhishek%20Sirivella"> Siva Abhishek Sirivella</a>, <a href="https://publications.waset.org/abstracts/search?q=Tejaswini%20Kallakuri"> Tejaswini Kallakuri</a>, <a href="https://publications.waset.org/abstracts/search?q=Uzair%20Ahmad"> Uzair Ahmad</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Effective traffic management and infrastructure planning are crucial for the development of smart cities and intelligent transportation systems. This study addresses the challenge of data scarcity by generating realistic synthetic traffic data using the PeMS-Bay dataset, improving the accuracy and reliability of predictive modeling. Advanced synthetic data generation techniques, including TimeGAN, GaussianCopula, and PAR Synthesizer, are employed to produce synthetic data that replicates the statistical and structural characteristics of real-world traffic. Future integration of Spatial-Temporal Generative Adversarial Networks (ST-GAN) is planned to capture both spatial and temporal correlations, further improving data quality and realism. The performance of each synthetic data generation model is evaluated against real-world data to identify the best models for accurately replicating traffic patterns. Long Short-Term Memory (LSTM) networks are utilized to model and predict complex temporal dependencies within traffic patterns. This comprehensive approach aims to pinpoint areas with low vehicle counts, uncover underlying traffic issues, and inform targeted infrastructure interventions. By combining GAN-based synthetic data generation with LSTM-based traffic modeling, this study supports data-driven decision-making that enhances urban mobility, safety, and the overall efficiency of city planning initiatives. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=GAN" title="GAN">GAN</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory" title=" long short-term memory"> long short-term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=synthetic%20data%20generation" title=" synthetic data generation"> synthetic data generation</a>, <a href="https://publications.waset.org/abstracts/search?q=traffic%20management" title=" traffic management"> traffic management</a> </p> <a href="https://publications.waset.org/abstracts/191235/statistically-accurate-synthetic-data-generation-for-enhanced-traffic-predictive-modeling-using-generative-adversarial-networks-and-long-short-term-memory" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/191235.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">25</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6983</span> Groundwater Level Prediction Using hybrid Particle Swarm Optimization-Long-Short Term Memory Model and Performance Evaluation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sneha%20Thakur">Sneha Thakur</a>, <a href="https://publications.waset.org/abstracts/search?q=Sanjeev%20Karmakar"> Sanjeev Karmakar</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper proposed hybrid Particle Swarm Optimization (PSO) – Long-Short Term Memory (LSTM) model for groundwater level prediction. The evaluation of the performance is realized using the parameters: root mean square error (RMSE) and mean absolute error (MAE). Ground water level forecasting will be very effective for planning water harvesting. Proper calculation of water level forecasting can overcome the problem of drought and flood to some extent. The objective of this work is to develop a ground water level forecasting model using deep learning technique integrated with optimization technique PSO by applying 29 years data of Chhattisgarh state, In-dia. It is important to find the precise forecasting in case of ground water level so that various water resource planning and water harvesting can be managed effectively. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory" title="long short-term memory">long short-term memory</a>, <a href="https://publications.waset.org/abstracts/search?q=particle%20swarm%20optimization" title=" particle swarm optimization"> particle swarm optimization</a>, <a href="https://publications.waset.org/abstracts/search?q=prediction" title=" prediction"> prediction</a>, <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title=" deep learning"> deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=groundwater%20level" title=" groundwater level"> groundwater level</a> </p> <a href="https://publications.waset.org/abstracts/171101/groundwater-level-prediction-using-hybrid-particle-swarm-optimization-long-short-term-memory-model-and-performance-evaluation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/171101.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">78</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6982</span> An Auxiliary Technique for Coronary Heart Disease Prediction by Analyzing Electrocardiogram Based on ResNet and Bi-Long Short-Term Memory</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yang%20Zhang">Yang Zhang</a>, <a href="https://publications.waset.org/abstracts/search?q=Jian%20He"> Jian He</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Heart disease is one of the leading causes of death in the world, and coronary heart disease (CHD) is one of the major heart diseases. Electrocardiogram (ECG) is widely used in the detection of heart diseases, but the traditional manual method for CHD prediction by analyzing ECG requires lots of professional knowledge for doctors. This paper introduces sliding window and continuous wavelet transform (CWT) to transform ECG signals into images, and then ResNet and Bi-LSTM are introduced to build the ECG feature extraction network (namely ECGNet). At last, an auxiliary system for coronary heart disease prediction was developed based on modified ResNet18 and Bi-LSTM, and the public ECG dataset of CHD from MIMIC-3 was used to train and test the system. The experimental results show that the accuracy of the method is 83%, and the F1-score is 83%. Compared with the available methods for CHD prediction based on ECG, such as kNN, decision tree, VGGNet, etc., this method not only improves the prediction accuracy but also could avoid the degradation phenomenon of the deep learning network. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Bi-LSTM" title="Bi-LSTM">Bi-LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=CHD" title=" CHD"> CHD</a>, <a href="https://publications.waset.org/abstracts/search?q=ECG" title=" ECG"> ECG</a>, <a href="https://publications.waset.org/abstracts/search?q=ResNet" title=" ResNet"> ResNet</a>, <a href="https://publications.waset.org/abstracts/search?q=sliding%C2%A0window" title=" sliding window"> sliding window</a> </p> <a href="https://publications.waset.org/abstracts/165165/an-auxiliary-technique-for-coronary-heart-disease-prediction-by-analyzing-electrocardiogram-based-on-resnet-and-bi-long-short-term-memory" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/165165.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">89</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6981</span> Prediction on Housing Price Based on Deep Learning</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Li%20Yu">Li Yu</a>, <a href="https://publications.waset.org/abstracts/search?q=Chenlu%20Jiao"> Chenlu Jiao</a>, <a href="https://publications.waset.org/abstracts/search?q=Hongrun%20Xin"> Hongrun Xin</a>, <a href="https://publications.waset.org/abstracts/search?q=Yan%20Wang"> Yan Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Kaiyang%20Wang"> Kaiyang Wang</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In order to study the impact of various factors on the housing price, we propose to build different prediction models based on deep learning to determine the existing data of the real estate in order to more accurately predict the housing price or its changing trend in the future. Considering that the factors which affect the housing price vary widely, the proposed prediction models include two categories. The first one is based on multiple characteristic factors of the real estate. We built Convolution Neural Network (CNN) prediction model and Long Short-Term Memory (LSTM) neural network prediction model based on deep learning, and logical regression model was implemented to make a comparison between these three models. Another prediction model is time series model. Based on deep learning, we proposed an LSTM-1 model purely regard to time series, then implementing and comparing the LSTM model and the Auto-Regressive and Moving Average (ARMA) model. In this paper, comprehensive study of the second-hand housing price in Beijing has been conducted from three aspects: crawling and analyzing, housing price predicting, and the result comparing. Ultimately the best model program was produced, which is of great significance to evaluation and prediction of the housing price in the real estate industry. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title="deep learning">deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=convolutional%20neural%20network" title=" convolutional neural network"> convolutional neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=LSTM" title=" LSTM"> LSTM</a>, <a href="https://publications.waset.org/abstracts/search?q=housing%20prediction" title=" housing prediction"> housing prediction</a> </p> <a href="https://publications.waset.org/abstracts/84747/prediction-on-housing-price-based-on-deep-learning" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/84747.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">306</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">6980</span> Directed-Wald Test for Distinguishing Long Memory and Nonlinearity Time Series: Power and Size Simulation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Heri%20Kuswanto">Heri Kuswanto</a>, <a href="https://publications.waset.org/abstracts/search?q=Philipp%20Sibbertsen"> Philipp Sibbertsen</a>, <a href="https://publications.waset.org/abstracts/search?q=Irhamah"> Irhamah </a> </p> <p class="card-text"><strong>Abstract:</strong></p> A Wald type test to distinguish between long memory and ESTAR nonlinearity has been developed. The test uses a directed-Wald statistic to overcome the problem of restricted parameters under the alternative. The test is derived from a model specification i.e. allows the transition parameter to appear as a nuisance parameter in the transition function. A simulation study has been conducted and it indicates that the approach leads a test with good size and power properties to distinguish between stationary long memory and ESTAR. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=directed-Wald%20test" title="directed-Wald test">directed-Wald test</a>, <a href="https://publications.waset.org/abstracts/search?q=ESTAR" title=" ESTAR"> ESTAR</a>, <a href="https://publications.waset.org/abstracts/search?q=long%20memory" title=" long memory"> long memory</a>, <a href="https://publications.waset.org/abstracts/search?q=distinguish" title=" distinguish"> distinguish</a> </p> <a href="https://publications.waset.org/abstracts/21296/directed-wald-test-for-distinguishing-long-memory-and-nonlinearity-time-series-power-and-size-simulation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/21296.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">480</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">‹</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=233">233</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=234">234</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=long%20short-term%20memory%20%28LSTM%29&page=2" rel="next">›</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" 
href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">© 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">×</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>