CINXE.COM
Search results for: Ensemble Algorithm
<!DOCTYPE html> <html lang="en" dir="ltr"> <head> <!-- Google tag (gtag.js) --> <script async src="https://www.googletagmanager.com/gtag/js?id=G-P63WKM1TM1"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-P63WKM1TM1'); </script> <!-- Yandex.Metrika counter --> <script type="text/javascript" > (function(m,e,t,r,i,k,a){m[i]=m[i]||function(){(m[i].a=m[i].a||[]).push(arguments)}; m[i].l=1*new Date(); for (var j = 0; j < document.scripts.length; j++) {if (document.scripts[j].src === r) { return; }} k=e.createElement(t),a=e.getElementsByTagName(t)[0],k.async=1,k.src=r,a.parentNode.insertBefore(k,a)}) (window, document, "script", "https://mc.yandex.ru/metrika/tag.js", "ym"); ym(55165297, "init", { clickmap:false, trackLinks:true, accurateTrackBounce:true, webvisor:false }); </script> <noscript><div><img src="https://mc.yandex.ru/watch/55165297" style="position:absolute; left:-9999px;" alt="" /></div></noscript> <!-- /Yandex.Metrika counter --> <!-- Matomo --> <!-- End Matomo Code --> <title>Search results for: Ensemble Algorithm</title> <meta name="description" content="Search results for: Ensemble Algorithm"> <meta name="keywords" content="Ensemble Algorithm"> <meta name="viewport" content="width=device-width, initial-scale=1, minimum-scale=1, maximum-scale=1, user-scalable=no"> <meta charset="utf-8"> <link href="https://cdn.waset.org/favicon.ico" type="image/x-icon" rel="shortcut icon"> <link href="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/css/bootstrap.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/plugins/fontawesome/css/all.min.css" rel="stylesheet"> <link href="https://cdn.waset.org/static/css/site.css?v=150220211555" rel="stylesheet"> </head> <body> <header> <div class="container"> <nav class="navbar navbar-expand-lg navbar-light"> <a class="navbar-brand" href="https://waset.org"> <img src="https://cdn.waset.org/static/images/wasetc.png" alt="Open Science Research Excellence" title="Open Science Research Excellence" /> </a> <button class="d-block d-lg-none navbar-toggler ml-auto" type="button" data-toggle="collapse" data-target="#navbarMenu" aria-controls="navbarMenu" aria-expanded="false" aria-label="Toggle navigation"> <span class="navbar-toggler-icon"></span> </button> <div class="w-100"> <div class="d-none d-lg-flex flex-row-reverse"> <form method="get" action="https://waset.org/search" class="form-inline my-2 my-lg-0"> <input class="form-control mr-sm-2" type="search" placeholder="Search Conferences" value="Ensemble Algorithm" name="q" aria-label="Search"> <button class="btn btn-light my-2 my-sm-0" type="submit"><i class="fas fa-search"></i></button> </form> </div> <div class="collapse navbar-collapse mt-1" id="navbarMenu"> <ul class="navbar-nav ml-auto align-items-center" id="mainNavMenu"> <li class="nav-item"> <a class="nav-link" href="https://waset.org/conferences" title="Conferences in 2024/2025/2026">Conferences</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/disciplines" title="Disciplines">Disciplines</a> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/committees" rel="nofollow">Committees</a> </li> <li class="nav-item dropdown"> <a class="nav-link dropdown-toggle" href="#" id="navbarDropdownPublications" role="button" data-toggle="dropdown" aria-haspopup="true" aria-expanded="false"> Publications </a> <div class="dropdown-menu" aria-labelledby="navbarDropdownPublications"> <a class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="Ensemble Algorithm"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 3738</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: Ensemble Algorithm</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3738</span> Fuzzy Optimization Multi-Objective Clustering Ensemble Model for Multi-Source Data Analysis</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=C.%20B.%20Le">C. B. Le</a>, <a href="https://publications.waset.org/abstracts/search?q=V.%20N.%20Pham"> V. N. Pham</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In modern data analysis, multi-source data appears more and more in real applications. Multi-source data clustering has emerged as a important issue in the data mining and machine learning community. Different data sources provide information about different data. Therefore, multi-source data linking is essential to improve clustering performance. However, in practice multi-source data is often heterogeneous, uncertain, and large. This issue is considered a major challenge from multi-source data. Ensemble is a versatile machine learning model in which learning techniques can work in parallel, with big data. Clustering ensemble has been shown to outperform any standard clustering algorithm in terms of accuracy and robustness. However, most of the traditional clustering ensemble approaches are based on single-objective function and single-source data. This paper proposes a new clustering ensemble method for multi-source data analysis. The fuzzy optimized multi-objective clustering ensemble method is called FOMOCE. Firstly, a clustering ensemble mathematical model based on the structure of multi-objective clustering function, multi-source data, and dark knowledge is introduced. Then, rules for extracting dark knowledge from the input data, clustering algorithms, and base clusterings are designed and applied. Finally, a clustering ensemble algorithm is proposed for multi-source data analysis. The experiments were performed on the standard sample data set. The experimental results demonstrate the superior performance of the FOMOCE method compared to the existing clustering ensemble methods and multi-source clustering methods. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=clustering%20ensemble" title="clustering ensemble">clustering ensemble</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-source" title=" multi-source"> multi-source</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-objective" title=" multi-objective"> multi-objective</a>, <a href="https://publications.waset.org/abstracts/search?q=fuzzy%20clustering" title=" fuzzy clustering"> fuzzy clustering</a> </p> <a href="https://publications.waset.org/abstracts/136598/fuzzy-optimization-multi-objective-clustering-ensemble-model-for-multi-source-data-analysis" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/136598.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">189</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3737</span> Sentiment Analysis of Ensemble-Based Classifiers for E-Mail Data</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Muthukumarasamy%20Govindarajan">Muthukumarasamy Govindarajan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Detection of unwanted, unsolicited mails called spam from email is an interesting area of research. It is necessary to evaluate the performance of any new spam classifier using standard data sets. Recently, ensemble-based classifiers have gained popularity in this domain. In this research work, an efficient email filtering approach based on ensemble methods is addressed for developing an accurate and sensitive spam classifier. The proposed approach employs Naive Bayes (NB), Support Vector Machine (SVM) and Genetic Algorithm (GA) as base classifiers along with different ensemble methods. The experimental results show that the ensemble classifier was performing with accuracy greater than individual classifiers, and also hybrid model results are found to be better than the combined models for the e-mail dataset. The proposed ensemble-based classifiers turn out to be good in terms of classification accuracy, which is considered to be an important criterion for building a robust spam classifier. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=accuracy" title="accuracy">accuracy</a>, <a href="https://publications.waset.org/abstracts/search?q=arcing" title=" arcing"> arcing</a>, <a href="https://publications.waset.org/abstracts/search?q=bagging" title=" bagging"> bagging</a>, <a href="https://publications.waset.org/abstracts/search?q=genetic%20algorithm" title=" genetic algorithm"> genetic algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=Naive%20Bayes" title=" Naive Bayes"> Naive Bayes</a>, <a href="https://publications.waset.org/abstracts/search?q=sentiment%20mining" title=" sentiment mining"> sentiment mining</a>, <a href="https://publications.waset.org/abstracts/search?q=support%20vector%20machine" title=" support vector machine"> support vector machine</a> </p> <a href="https://publications.waset.org/abstracts/112240/sentiment-analysis-of-ensemble-based-classifiers-for-e-mail-data" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/112240.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">142</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3736</span> Rank-Based Chain-Mode Ensemble for Binary Classification</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Chongya%20Song">Chongya Song</a>, <a href="https://publications.waset.org/abstracts/search?q=Kang%20Yen"> Kang Yen</a>, <a href="https://publications.waset.org/abstracts/search?q=Alexander%20Pons"> Alexander Pons</a>, <a href="https://publications.waset.org/abstracts/search?q=Jin%20Liu"> Jin Liu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the field of machine learning, the ensemble has been employed as a common methodology to improve the performance upon multiple base classifiers. However, the true predictions are often canceled out by the false ones during consensus due to a phenomenon called “curse of correlation” which is represented as the strong interferences among the predictions produced by the base classifiers. In addition, the existing practices are still not able to effectively mitigate the problem of imbalanced classification. Based on the analysis on our experiment results, we conclude that the two problems are caused by some inherent deficiencies in the approach of consensus. Therefore, we create an enhanced ensemble algorithm which adopts a designed rank-based chain-mode consensus to overcome the two problems. In order to evaluate the proposed ensemble algorithm, we employ a well-known benchmark data set NSL-KDD (the improved version of dataset KDDCup99 produced by University of New Brunswick) to make comparisons between the proposed and 8 common ensemble algorithms. Particularly, each compared ensemble classifier uses the same 22 base classifiers, so that the differences in terms of the improvements toward the accuracy and reliability upon the base classifiers can be truly revealed. As a result, the proposed rank-based chain-mode consensus is proved to be a more effective ensemble solution than the traditional consensus approach, which outperforms the 8 ensemble algorithms by 20% on almost all compared metrices which include accuracy, precision, recall, F1-score and area under receiver operating characteristic curve. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=consensus" title="consensus">consensus</a>, <a href="https://publications.waset.org/abstracts/search?q=curse%20of%20correlation" title=" curse of correlation"> curse of correlation</a>, <a href="https://publications.waset.org/abstracts/search?q=imbalance%20classification" title=" imbalance classification"> imbalance classification</a>, <a href="https://publications.waset.org/abstracts/search?q=rank-based%20chain-mode%20ensemble" title=" rank-based chain-mode ensemble"> rank-based chain-mode ensemble</a> </p> <a href="https://publications.waset.org/abstracts/112891/rank-based-chain-mode-ensemble-for-binary-classification" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/112891.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">138</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3735</span> Random Subspace Ensemble of CMAC Classifiers </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Somaiyeh%20Dehghan">Somaiyeh Dehghan</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammad%20Reza%20Kheirkhahan%20Haghighi"> Mohammad Reza Kheirkhahan Haghighi </a> </p> <p class="card-text"><strong>Abstract:</strong></p> The rapid growth of domains that have data with a large number of features, while the number of samples is limited has caused difficulty in constructing strong classifiers. To reduce the dimensionality of the feature space becomes an essential step in classification task. Random subspace method (or attribute bagging) is an ensemble classifier that consists of several classifiers that each base learner in ensemble has subset of features. In the present paper, we introduce Random Subspace Ensemble of CMAC neural network (RSE-CMAC), each of which has training with subset of features. Then we use this model for classification task. For evaluation performance of our model, we compare it with bagging algorithm on 36 UCI datasets. The results reveal that the new model has better performance. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=classification" title="classification">classification</a>, <a href="https://publications.waset.org/abstracts/search?q=random%20subspace" title=" random subspace"> random subspace</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble" title=" ensemble"> ensemble</a>, <a href="https://publications.waset.org/abstracts/search?q=CMAC%20neural%20network" title=" CMAC neural network"> CMAC neural network</a> </p> <a href="https://publications.waset.org/abstracts/14371/random-subspace-ensemble-of-cmac-classifiers" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/14371.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">329</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3734</span> Feature Evaluation Based on Random Subspace and Multiple-K Ensemble</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jaehong%20Yu">Jaehong Yu</a>, <a href="https://publications.waset.org/abstracts/search?q=Seoung%20Bum%20Kim"> Seoung Bum Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Clustering analysis can facilitate the extraction of intrinsic patterns in a dataset and reveal its natural groupings without requiring class information. For effective clustering analysis in high dimensional datasets, unsupervised dimensionality reduction is an important task. Unsupervised dimensionality reduction can generally be achieved by feature extraction or feature selection. In many situations, feature selection methods are more appropriate than feature extraction methods because of their clear interpretation with respect to the original features. The unsupervised feature selection can be categorized as feature subset selection and feature ranking method, and we focused on unsupervised feature ranking methods which evaluate the features based on their importance scores. Recently, several unsupervised feature ranking methods were developed based on ensemble approaches to achieve their higher accuracy and stability. However, most of the ensemble-based feature ranking methods require the true number of clusters. Furthermore, these algorithms evaluate the feature importance depending on the ensemble clustering solution, and they produce undesirable evaluation results if the clustering solutions are inaccurate. To address these limitations, we proposed an ensemble-based feature ranking method with random subspace and multiple-k ensemble (FRRM). The proposed FRRM algorithm evaluates the importance of each feature with the random subspace ensemble, and all evaluation results are combined with the ensemble importance scores. Moreover, FRRM does not require the determination of the true number of clusters in advance through the use of the multiple-k ensemble idea. Experiments on various benchmark datasets were conducted to examine the properties of the proposed FRRM algorithm and to compare its performance with that of existing feature ranking methods. The experimental results demonstrated that the proposed FRRM outperformed the competitors. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=clustering%20analysis" title="clustering analysis">clustering analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=multiple-k%20ensemble" title=" multiple-k ensemble"> multiple-k ensemble</a>, <a href="https://publications.waset.org/abstracts/search?q=random%20subspace-based%20feature%20evaluation" title=" random subspace-based feature evaluation"> random subspace-based feature evaluation</a>, <a href="https://publications.waset.org/abstracts/search?q=unsupervised%20feature%20ranking" title=" unsupervised feature ranking"> unsupervised feature ranking</a> </p> <a href="https://publications.waset.org/abstracts/52081/feature-evaluation-based-on-random-subspace-and-multiple-k-ensemble" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/52081.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">339</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3733</span> An Ensemble-based Method for Vehicle Color Recognition</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Saeedeh%20Barzegar%20Khalilsaraei">Saeedeh Barzegar Khalilsaraei</a>, <a href="https://publications.waset.org/abstracts/search?q=Manoocheher%20Kelarestaghi"> Manoocheher Kelarestaghi</a>, <a href="https://publications.waset.org/abstracts/search?q=Farshad%20Eshghi"> Farshad Eshghi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The vehicle color, as a prominent and stable feature, helps to identify a vehicle more accurately. As a result, vehicle color recognition is of great importance in intelligent transportation systems. Unlike conventional methods which use only a single Convolutional Neural Network (CNN) for feature extraction or classification, in this paper, four CNNs, with different architectures well-performing in different classes, are trained to extract various features from the input image. To take advantage of the distinct capability of each network, the multiple outputs are combined using a stack generalization algorithm as an ensemble technique. As a result, the final model performs better than each CNN individually in vehicle color identification. The evaluation results in terms of overall average accuracy and accuracy variance show the proposed method’s outperformance compared to the state-of-the-art rivals. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Vehicle%20Color%20Recognition" title="Vehicle Color Recognition">Vehicle Color Recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm" title="Ensemble Algorithm">Ensemble Algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=Stack%20Generalization" title="Stack Generalization">Stack Generalization</a>, <a href="https://publications.waset.org/abstracts/search?q=Convolutional%20Neural%20Network" title="Convolutional Neural Network">Convolutional Neural Network</a> </p> <a href="https://publications.waset.org/abstracts/146909/an-ensemble-based-method-for-vehicle-color-recognition" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/146909.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">85</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3732</span> A Genetic Algorithm Based Ensemble Method with Pairwise Consensus Score on Malware Cacophonous Labels</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shih-Yu%20Wang">Shih-Yu Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Shun-Wen%20Hsiao"> Shun-Wen Hsiao</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the field of cybersecurity, there exists many vendors giving malware samples classified results, namely naming after the label that contains some important information which is also called AV label. Lots of researchers relay on AV labels for research. Unfortunately, AV labels are too cluttered. They do not have a fixed format and fixed naming rules because the naming results were based on each classifiers' viewpoints. A way to fix the problem is taking a majority vote. However, voting can sometimes create problems of bias. Thus, we create a novel ensemble approach which does not rely on the cacophonous naming result but depend on group identification to aggregate everyone's opinion. To achieve this purpose, we develop an scoring system called Pairwise Consensus Score (PCS) to calculate result similarity. The entire method architecture combine Genetic Algorithm and PCS to find maximum consensus in the group. Experimental results revealed that our method outperformed the majority voting by 10% in term of the score. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=genetic%20algorithm" title="genetic algorithm">genetic algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20learning" title=" ensemble learning"> ensemble learning</a>, <a href="https://publications.waset.org/abstracts/search?q=malware%20family" title=" malware family"> malware family</a>, <a href="https://publications.waset.org/abstracts/search?q=malware%20labeling" title=" malware labeling"> malware labeling</a>, <a href="https://publications.waset.org/abstracts/search?q=AV%20labels" title=" AV labels"> AV labels</a> </p> <a href="https://publications.waset.org/abstracts/159376/a-genetic-algorithm-based-ensemble-method-with-pairwise-consensus-score-on-malware-cacophonous-labels" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/159376.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">86</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3731</span> Gene Prediction in DNA Sequences Using an Ensemble Algorithm Based on Goertzel Algorithm and Anti-Notch Filter</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Hamidreza%20Saberkari">Hamidreza Saberkari</a>, <a href="https://publications.waset.org/abstracts/search?q=Mousa%20Shamsi"> Mousa Shamsi</a>, <a href="https://publications.waset.org/abstracts/search?q=Hossein%20Ahmadi"> Hossein Ahmadi</a>, <a href="https://publications.waset.org/abstracts/search?q=Saeed%20Vaali"> Saeed Vaali</a>, <a href="https://publications.waset.org/abstracts/search?q="></a>, <a href="https://publications.waset.org/abstracts/search?q=MohammadHossein%20Sedaaghi">MohammadHossein Sedaaghi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the recent years, using signal processing tools for accurate identification of the protein coding regions has become a challenge in bioinformatics. Most of the genomic signal processing methods is based on the period-3 characteristics of the nucleoids in DNA strands and consequently, spectral analysis is applied to the numerical sequences of DNA to find the location of periodical components. In this paper, a novel ensemble algorithm for gene selection in DNA sequences has been presented which is based on the combination of Goertzel algorithm and anti-notch filter (ANF). The proposed algorithm has many advantages when compared to other conventional methods. Firstly, it leads to identify the coding protein regions more accurate due to using the Goertzel algorithm which is tuned at the desired frequency. Secondly, faster detection time is achieved. The proposed algorithm is applied on several genes, including genes available in databases BG570 and HMR195 and their results are compared to other methods based on the nucleotide level evaluation criteria. Implementation results show the excellent performance of the proposed algorithm in identifying protein coding regions, specifically in identification of small-scale gene areas. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=protein%20coding%20regions" title="protein coding regions">protein coding regions</a>, <a href="https://publications.waset.org/abstracts/search?q=period-3" title=" period-3"> period-3</a>, <a href="https://publications.waset.org/abstracts/search?q=anti-notch%20filter" title=" anti-notch filter"> anti-notch filter</a>, <a href="https://publications.waset.org/abstracts/search?q=Goertzel%20algorithm" title=" Goertzel algorithm"> Goertzel algorithm</a> </p> <a href="https://publications.waset.org/abstracts/10286/gene-prediction-in-dna-sequences-using-an-ensemble-algorithm-based-on-goertzel-algorithm-and-anti-notch-filter" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/10286.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">387</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3730</span> A Video Surveillance System Using an Ensemble of Simple Neural Network Classifiers</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Rodrigo%20S.%20Moreira">Rodrigo S. Moreira</a>, <a href="https://publications.waset.org/abstracts/search?q=Nelson%20F.%20F.%20Ebecken"> Nelson F. F. Ebecken</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper proposes a maritime vessel tracker composed of an ensemble of WiSARD weightless neural network classifiers. A failure detector analyzes vessel movement with a Kalman filter and corrects the tracking, if necessary, using FFT matching. The use of the WiSARD neural network to track objects is uncommon. The additional contributions of the present study include a performance comparison with four state-of-art trackers, an experimental study of the features that improve maritime vessel tracking, the first use of an ensemble of classifiers to track maritime vessels and a new quantization algorithm that compares the values of pixel pairs. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ram%20memory" title="ram memory">ram memory</a>, <a href="https://publications.waset.org/abstracts/search?q=WiSARD%20weightless%20neural%20network" title=" WiSARD weightless neural network"> WiSARD weightless neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=object%20tracking" title=" object tracking"> object tracking</a>, <a href="https://publications.waset.org/abstracts/search?q=quantization" title=" quantization"> quantization</a> </p> <a href="https://publications.waset.org/abstracts/49928/a-video-surveillance-system-using-an-ensemble-of-simple-neural-network-classifiers" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/49928.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">310</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3729</span> Multilabel Classification with Neural Network Ensemble Method</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sezin%20Ek%C5%9Fio%C4%9Flu">Sezin Ekşioğlu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Multilabel classification has a huge importance for several applications, it is also a challenging research topic. It is a kind of supervised learning that contains binary targets. The distance between multilabel and binary classification is having more than one class in multilabel classification problems. Features can belong to one class or many classes. There exists a wide range of applications for multi label prediction such as image labeling, text categorization, gene functionality. Even though features are classified in many classes, they may not always be properly classified. There are many ensemble methods for the classification. However, most of the researchers have been concerned about better multilabel methods. Especially little ones focus on both efficiency of classifiers and pairwise relationships at the same time in order to implement better multilabel classification. In this paper, we worked on modified ensemble methods by getting benefit from k-Nearest Neighbors and neural network structure to address issues within a beneficial way and to get better impacts from the multilabel classification. Publicly available datasets (yeast, emotion, scene and birds) are performed to demonstrate the developed algorithm efficiency and the technique is measured by accuracy, F1 score and hamming loss metrics. Our algorithm boosts benchmarks for each datasets with different metrics. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=multilabel" title="multilabel">multilabel</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a>, <a href="https://publications.waset.org/abstracts/search?q=neural%20network" title=" neural network"> neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=KNN" title=" KNN"> KNN</a> </p> <a href="https://publications.waset.org/abstracts/148169/multilabel-classification-with-neural-network-ensemble-method" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/148169.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">155</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3728</span> Breast Cancer Survivability Prediction via Classifier Ensemble</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Mohamed%20Al-Badrashiny">Mohamed Al-Badrashiny</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdelghani%20Bellaachia"> Abdelghani Bellaachia</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper presents a classifier ensemble approach for predicting the survivability of the breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components; features selection and classifier ensemble components. The features selection component divides the features in SEER database into four groups. After that it tries to find the most important features among the four groups that maximizes the weighted average F-score of a certain classification algorithm. The ensemble component uses three different classifiers, each of which models different set of features from SEER through the features selection module. On top of them, another classifier is used to give the final decision based on the output decisions and confidence scores from each of the underlying classifiers. Different classification algorithms have been examined; the best setup found is by using the decision tree, Bayesian network, and Na¨ıve Bayes algorithms for the underlying classifiers and Na¨ıve Bayes for the classifier ensemble step. The system outperforms all published systems to date when evaluated against the exact same data of SEER (period of 1973-2002). It gives 87.39% weighted average F-score compared to 85.82% and 81.34% of the other published systems. By increasing the data size to cover the whole database (period of 1973-2014), the overall weighted average F-score jumps to 92.4% on the held out unseen test set. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=classifier%20ensemble" title="classifier ensemble">classifier ensemble</a>, <a href="https://publications.waset.org/abstracts/search?q=breast%20cancer%20survivability" title=" breast cancer survivability"> breast cancer survivability</a>, <a href="https://publications.waset.org/abstracts/search?q=data%20mining" title=" data mining"> data mining</a>, <a href="https://publications.waset.org/abstracts/search?q=SEER" title=" SEER"> SEER</a> </p> <a href="https://publications.waset.org/abstracts/42621/breast-cancer-survivability-prediction-via-classifier-ensemble" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/42621.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">328</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3727</span> Evaluation of Ensemble Classifiers for Intrusion Detection </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=M.%20Govindarajan">M. Govindarajan </a> </p> <p class="card-text"><strong>Abstract:</strong></p> One of the major developments in machine learning in the past decade is the ensemble method, which finds highly accurate classifier by combining many moderately accurate component classifiers. In this research work, new ensemble classification methods are proposed with homogeneous ensemble classifier using bagging and heterogeneous ensemble classifier using arcing and their performances are analyzed in terms of accuracy. A Classifier ensemble is designed using Radial Basis Function (RBF) and Support Vector Machine (SVM) as base classifiers. The feasibility and the benefits of the proposed approaches are demonstrated by the means of standard datasets of intrusion detection. The main originality of the proposed approach is based on three main parts: preprocessing phase, classification phase, and combining phase. A wide range of comparative experiments is conducted for standard datasets of intrusion detection. The performance of the proposed homogeneous and heterogeneous ensemble classifiers are compared to the performance of other standard homogeneous and heterogeneous ensemble methods. The standard homogeneous ensemble methods include Error correcting output codes, Dagging and heterogeneous ensemble methods include majority voting, stacking. The proposed ensemble methods provide significant improvement of accuracy compared to individual classifiers and the proposed bagged RBF and SVM performs significantly better than ECOC and Dagging and the proposed hybrid RBF-SVM performs significantly better than voting and stacking. Also heterogeneous models exhibit better results than homogeneous models for standard datasets of intrusion detection. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=data%20mining" title="data mining">data mining</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble" title=" ensemble"> ensemble</a>, <a href="https://publications.waset.org/abstracts/search?q=radial%20basis%20function" title=" radial basis function"> radial basis function</a>, <a href="https://publications.waset.org/abstracts/search?q=support%20vector%20machine" title=" support vector machine"> support vector machine</a>, <a href="https://publications.waset.org/abstracts/search?q=accuracy" title=" accuracy"> accuracy</a> </p> <a href="https://publications.waset.org/abstracts/43650/evaluation-of-ensemble-classifiers-for-intrusion-detection" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/43650.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">248</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3726</span> A Dynamic Ensemble Learning Approach for Online Anomaly Detection in Alibaba Datacenters</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Wanyi%20Zhu">Wanyi Zhu</a>, <a href="https://publications.waset.org/abstracts/search?q=Xia%20Ming"> Xia Ming</a>, <a href="https://publications.waset.org/abstracts/search?q=Huafeng%20Wang"> Huafeng Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Junda%20Chen"> Junda Chen</a>, <a href="https://publications.waset.org/abstracts/search?q=Lu%20Liu"> Lu Liu</a>, <a href="https://publications.waset.org/abstracts/search?q=Jiangwei%20Jiang"> Jiangwei Jiang</a>, <a href="https://publications.waset.org/abstracts/search?q=Guohua%20Liu"> Guohua Liu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Anomaly detection is a first and imperative step needed to respond to unexpected problems and to assure high performance and security in large data center management. This paper presents an online anomaly detection system through an innovative approach of ensemble machine learning and adaptive differentiation algorithms, and applies them to performance data collected from a continuous monitoring system for multi-tier web applications running in Alibaba data centers. We evaluate the effectiveness and efficiency of this algorithm with production traffic data and compare with the traditional anomaly detection approaches such as a static threshold and other deviation-based detection techniques. The experiment results show that our algorithm correctly identifies the unexpected performance variances of any running application, with an acceptable false positive rate. This proposed approach has already been deployed in real-time production environments to enhance the efficiency and stability in daily data center operations. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Alibaba%20data%20centers" title="Alibaba data centers">Alibaba data centers</a>, <a href="https://publications.waset.org/abstracts/search?q=anomaly%20detection" title=" anomaly detection"> anomaly detection</a>, <a href="https://publications.waset.org/abstracts/search?q=big%20data%20computation" title=" big data computation"> big data computation</a>, <a href="https://publications.waset.org/abstracts/search?q=dynamic%20ensemble%20learning" title=" dynamic ensemble learning"> dynamic ensemble learning</a> </p> <a href="https://publications.waset.org/abstracts/86171/a-dynamic-ensemble-learning-approach-for-online-anomaly-detection-in-alibaba-datacenters" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/86171.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">200</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3725</span> Umbrella Reinforcement Learning – A Tool for Hard Problems</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Egor%20E.%20Nuzhin">Egor E. Nuzhin</a>, <a href="https://publications.waset.org/abstracts/search?q=Nikolay%20V.%20Brilliantov">Nikolay V. Brilliantov</a> </p> <p class="card-text"><strong>Abstract:</strong></p> We propose an approach for addressing Reinforcement Learning (RL) problems. It combines the ideas of umbrella sampling, borrowed from Monte Carlo technique of computational physics and chemistry, with optimal control methods, and is realized on the base of neural networks. This results in a powerful algorithm, designed to solve hard RL problems – the problems, with long-time delayed reward, state-traps sticking and a lack of terminal states. It outperforms the prominent algorithms, such as PPO, RND, iLQR and VI, which are among the most efficient for the hard problems. The new algorithm deals with a continuous ensemble of agents and expected return, that includes the ensemble entropy. This results in a quick and efficient search of the optimal policy in terms of ”exploration-exploitation trade-off” in the state-action space. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=umbrella%20sampling" title="umbrella sampling">umbrella sampling</a>, <a href="https://publications.waset.org/abstracts/search?q=reinforcement%20learning" title=" reinforcement learning"> reinforcement learning</a>, <a href="https://publications.waset.org/abstracts/search?q=policy%20gradient" title=" policy gradient"> policy gradient</a>, <a href="https://publications.waset.org/abstracts/search?q=dynamic%20programming" title=" dynamic programming"> dynamic programming</a> </p> <a href="https://publications.waset.org/abstracts/192151/umbrella-reinforcement-learning-a-tool-for-hard-problems" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/192151.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">21</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3724</span> Decision Trees Constructing Based on K-Means Clustering Algorithm</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Loai%20Abdallah">Loai Abdallah</a>, <a href="https://publications.waset.org/abstracts/search?q=Malik%20Yousef"> Malik Yousef</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A domain space for the data should reflect the actual similarity between objects. Since objects belonging to the same cluster usually share some common traits even though their geometric distance might be relatively large. In general, the Euclidean distance of data points that represented by large number of features is not capturing the actual relation between those points. In this study, we propose a new method to construct a different space that is based on clustering to form a new distance metric. The new distance space is based on ensemble clustering (EC). The EC distance space is defined by tracking the membership of the points over multiple runs of clustering algorithm metric. Over this distance, we train the decision trees classifier (DT-EC). The results obtained by applying DT-EC on 10 datasets confirm our hypotheses that embedding the EC space as a distance metric would improve the performance. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ensemble%20clustering" title="ensemble clustering">ensemble clustering</a>, <a href="https://publications.waset.org/abstracts/search?q=decision%20trees" title=" decision trees"> decision trees</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a>, <a href="https://publications.waset.org/abstracts/search?q=K%20nearest%20neighbors" title=" K nearest neighbors"> K nearest neighbors</a> </p> <a href="https://publications.waset.org/abstracts/89656/decision-trees-constructing-based-on-k-means-clustering-algorithm" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/89656.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">190</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3723</span> Methods for Enhancing Ensemble Learning or Improving Classifiers of This Technique in the Analysis and Classification of Brain Signals</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Seyed%20Mehdi%20Ghezi">Seyed Mehdi Ghezi</a>, <a href="https://publications.waset.org/abstracts/search?q=Hesam%20Hasanpoor"> Hesam Hasanpoor</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This scientific article explores enhancement methods for ensemble learning with the aim of improving the performance of classifiers in the analysis and classification of brain signals. The research approach in this field consists of two main parts, each with its own strengths and weaknesses. The choice of approach depends on the specific research question and available resources. By combining these approaches and leveraging their respective strengths, researchers can enhance the accuracy and reliability of classification results, consequently advancing our understanding of the brain and its functions. The first approach focuses on utilizing machine learning methods to identify the best features among the vast array of features present in brain signals. The selection of features varies depending on the research objective, and different techniques have been employed for this purpose. For instance, the genetic algorithm has been used in some studies to identify the best features, while optimization methods have been utilized in others to identify the most influential features. Additionally, machine learning techniques have been applied to determine the influential electrodes in classification. Ensemble learning plays a crucial role in identifying the best features that contribute to learning, thereby improving the overall results. The second approach concentrates on designing and implementing methods for selecting the best classifier or utilizing meta-classifiers to enhance the final results in ensemble learning. In a different section of the research, a single classifier is used instead of multiple classifiers, employing different sets of features to improve the results. The article provides an in-depth examination of each technique, highlighting their advantages and limitations. By integrating these techniques, researchers can enhance the performance of classifiers in the analysis and classification of brain signals. This advancement in ensemble learning methodologies contributes to a better understanding of the brain and its functions, ultimately leading to improved accuracy and reliability in brain signal analysis and classification. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ensemble%20learning" title="ensemble learning">ensemble learning</a>, <a href="https://publications.waset.org/abstracts/search?q=brain%20signals" title=" brain signals"> brain signals</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a>, <a href="https://publications.waset.org/abstracts/search?q=feature%20selection" title=" feature selection"> feature selection</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=genetic%20algorithm" title=" genetic algorithm"> genetic algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=optimization%20methods" title=" optimization methods"> optimization methods</a>, <a href="https://publications.waset.org/abstracts/search?q=influential%20features" title=" influential features"> influential features</a>, <a href="https://publications.waset.org/abstracts/search?q=influential%20electrodes" title=" influential electrodes"> influential electrodes</a>, <a href="https://publications.waset.org/abstracts/search?q=meta-classifiers" title=" meta-classifiers"> meta-classifiers</a> </p> <a href="https://publications.waset.org/abstracts/177312/methods-for-enhancing-ensemble-learning-or-improving-classifiers-of-this-technique-in-the-analysis-and-classification-of-brain-signals" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/177312.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">75</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3722</span> Ensemble-Based SVM Classification Approach for miRNA Prediction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sondos%20M.%20Hammad">Sondos M. Hammad</a>, <a href="https://publications.waset.org/abstracts/search?q=Sherin%20M.%20ElGokhy"> Sherin M. ElGokhy</a>, <a href="https://publications.waset.org/abstracts/search?q=Mahmoud%20M.%20Fahmy"> Mahmoud M. Fahmy</a>, <a href="https://publications.waset.org/abstracts/search?q=Elsayed%20A.%20Sallam"> Elsayed A. Sallam</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, an ensemble-based Support Vector Machine (SVM) classification approach is proposed. It is used for miRNA prediction. Three problems, commonly associated with previous approaches, are alleviated. These problems arise due to impose assumptions on the secondary structural of premiRNA, imbalance between the numbers of the laboratory checked miRNAs and the pseudo-hairpins, and finally using a training data set that does not consider all the varieties of samples in different species. We aggregate the predicted outputs of three well-known SVM classifiers; namely, Triplet-SVM, Virgo and Mirident, weighted by their variant features without any structural assumptions. An additional SVM layer is used in aggregating the final output. The proposed approach is trained and then tested with balanced data sets. The results of the proposed approach outperform the three base classifiers. Improved values for the metrics of 88.88% f-score, 92.73% accuracy, 90.64% precision, 96.64% specificity, 87.2% sensitivity, and the area under the ROC curve is 0.91 are achieved. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=MiRNAs" title="MiRNAs">MiRNAs</a>, <a href="https://publications.waset.org/abstracts/search?q=SVM%20classification" title=" SVM classification"> SVM classification</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20algorithm" title=" ensemble algorithm"> ensemble algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=assumption%20problem" title=" assumption problem"> assumption problem</a>, <a href="https://publications.waset.org/abstracts/search?q=imbalance%20data" title=" imbalance data"> imbalance data</a> </p> <a href="https://publications.waset.org/abstracts/32331/ensemble-based-svm-classification-approach-for-mirna-prediction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/32331.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">349</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3721</span> Extreme Temperature Response to Solar Radiation Management in Southeast Asia</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Heri%20Kuswanto">Heri Kuswanto</a>, <a href="https://publications.waset.org/abstracts/search?q=Brina%20Miftahurrohmah"> Brina Miftahurrohmah</a>, <a href="https://publications.waset.org/abstracts/search?q=Fatkhurokhman%20Fauzi"> Fatkhurokhman Fauzi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Southeast Asia has experienced rising temperatures and is predicted to reach a 1.5°C increase by 2030, which is earlier than the Paris Agreement target. Solar Radiation Management (SRM) has been proposed as an alternative to combat global warming. This research investigates changes in the annual maximum temperature (TXx) with and without SRM over southeast Asia. We examined outputs from three ensemble members of the Geoengineering Large Ensemble Project (GLENS) experiment for the period 2051 to 2080. One ensemble member generated outputs that significantly deviated from the others, leading to the removal of ensemble 3 from the impact analysis. Our observations indicate that the magnitude of TXx changes with SRM is heterogeneous across countries. We found that SRM significantly reduces TXx levels compared to historical periods. Furthermore, SRM can reduce temperatures by up to 5°C compared to scenarios without SRM, with even more pronounced effects in Thailand, Cambodia, Laos, and Myanmar. This indicates that SRM can mitigate climate change by lowering future TXx levels. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=solar%20radiation%20management" title="solar radiation management">solar radiation management</a>, <a href="https://publications.waset.org/abstracts/search?q=GLENS" title=" GLENS"> GLENS</a>, <a href="https://publications.waset.org/abstracts/search?q=extreme" title=" extreme"> extreme</a>, <a href="https://publications.waset.org/abstracts/search?q=temperature" title=" temperature"> temperature</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble" title=" ensemble"> ensemble</a> </p> <a href="https://publications.waset.org/abstracts/193495/extreme-temperature-response-to-solar-radiation-management-in-southeast-asia" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/193495.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">14</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3720</span> Ensemble Machine Learning Approach for Estimating Missing Data from CO₂ Time Series</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Atbin%20Mahabbati">Atbin Mahabbati</a>, <a href="https://publications.waset.org/abstracts/search?q=Jason%20Beringer"> Jason Beringer</a>, <a href="https://publications.waset.org/abstracts/search?q=Matthias%20Leopold"> Matthias Leopold</a> </p> <p class="card-text"><strong>Abstract:</strong></p> To address the global challenges of climate and environmental changes, there is a need for quantifying and reducing uncertainties in environmental data, including observations of carbon, water, and energy. Global eddy covariance flux tower networks (FLUXNET), and their regional counterparts (i.e., OzFlux, AmeriFlux, China Flux, etc.) were established in the late 1990s and early 2000s to address the demand. Despite the capability of eddy covariance in validating process modelling analyses, field surveys and remote sensing assessments, there are some serious concerns regarding the challenges associated with the technique, e.g. data gaps and uncertainties. To address these concerns, this research has developed an ensemble model to fill the data gaps of CO₂ flux to avoid the limitations of using a single algorithm, and therefore, provide less error and decline the uncertainties associated with the gap-filling process. In this study, the data of five towers in the OzFlux Network (Alice Springs Mulga, Calperum, Gingin, Howard Springs and Tumbarumba) during 2013 were used to develop an ensemble machine learning model, using five feedforward neural networks (FFNN) with different structures combined with an eXtreme Gradient Boosting (XGB) algorithm. The former methods, FFNN, provided the primary estimations in the first layer, while the later, XGB, used the outputs of the first layer as its input to provide the final estimations of CO₂ flux. The introduced model showed slight superiority over each single FFNN and the XGB, while each of these two methods was used individually, overall RMSE: 2.64, 2.91, and 3.54 g C m⁻² yr⁻¹ respectively (3.54 provided by the best FFNN). The most significant improvement happened to the estimation of the extreme diurnal values (during midday and sunrise), as well as nocturnal estimations, which is generally considered as one of the most challenging parts of CO₂ flux gap-filling. The towers, as well as seasonality, showed different levels of sensitivity to improvements provided by the ensemble model. For instance, Tumbarumba showed more sensitivity compared to Calperum, where the differences between the Ensemble model on the one hand and the FFNNs and XGB, on the other hand, were the least of all 5 sites. Besides, the performance difference between the ensemble model and its components individually were more significant during the warm season (Jan, Feb, Mar, Oct, Nov, and Dec) compared to the cold season (Apr, May, Jun, Jul, Aug, and Sep) due to the higher amount of photosynthesis of plants, which led to a larger range of CO₂ exchange. In conclusion, the introduced ensemble model slightly improved the accuracy of CO₂ flux gap-filling and robustness of the model. Therefore, using ensemble machine learning models is potentially capable of improving data estimation and regression outcome when it seems to be no more room for improvement while using a single algorithm. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=carbon%20flux" title="carbon flux">carbon flux</a>, <a href="https://publications.waset.org/abstracts/search?q=Eddy%20covariance" title=" Eddy covariance"> Eddy covariance</a>, <a href="https://publications.waset.org/abstracts/search?q=extreme%20gradient%20boosting" title=" extreme gradient boosting"> extreme gradient boosting</a>, <a href="https://publications.waset.org/abstracts/search?q=gap-filling%20comparison" title=" gap-filling comparison"> gap-filling comparison</a>, <a href="https://publications.waset.org/abstracts/search?q=hybrid%20model" title=" hybrid model"> hybrid model</a>, <a href="https://publications.waset.org/abstracts/search?q=OzFlux%20network" title=" OzFlux network"> OzFlux network</a> </p> <a href="https://publications.waset.org/abstracts/122496/ensemble-machine-learning-approach-for-estimating-missing-data-from-co2-time-series" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/122496.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">139</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3719</span> Machine Learning Predictive Models for Hydroponic Systems: A Case Study Nutrient Film Technique and Deep Flow Technique</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Kritiyaporn%20Kunsook">Kritiyaporn Kunsook</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Machine learning algorithms (MLAs) such us artificial neural networks (ANNs), decision tree, support vector machines (SVMs), Naïve Bayes, and ensemble classifier by voting are powerful data driven methods that are relatively less widely used in the mapping of technique of system, and thus have not been comparatively evaluated together thoroughly in this field. The performances of a series of MLAs, ANNs, decision tree, SVMs, Naïve Bayes, and ensemble classifier by voting in technique of hydroponic systems prospectively modeling are compared based on the accuracy of each model. Classification of hydroponic systems only covers the test samples from vegetables grown with Nutrient film technique (NFT) and Deep flow technique (DFT). The feature, which are the characteristics of vegetables compose harvesting height width, temperature, require light and color. The results indicate that the classification performance of the ANNs is 98%, decision tree is 98%, SVMs is 97.33%, Naïve Bayes is 96.67%, and ensemble classifier by voting is 98.96% algorithm respectively. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=artificial%20neural%20networks" title="artificial neural networks">artificial neural networks</a>, <a href="https://publications.waset.org/abstracts/search?q=decision%20tree" title=" decision tree"> decision tree</a>, <a href="https://publications.waset.org/abstracts/search?q=support%20vector%20machines" title=" support vector machines"> support vector machines</a>, <a href="https://publications.waset.org/abstracts/search?q=na%C3%AFve%20Bayes" title=" naïve Bayes"> naïve Bayes</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20classifier%20by%20voting" title=" ensemble classifier by voting"> ensemble classifier by voting</a> </p> <a href="https://publications.waset.org/abstracts/91070/machine-learning-predictive-models-for-hydroponic-systems-a-case-study-nutrient-film-technique-and-deep-flow-technique" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/91070.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">372</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3718</span> Enhancing Predictive Accuracy in Pharmaceutical Sales through an Ensemble Kernel Gaussian Process Regression Approach</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shahin%20Mirshekari">Shahin Mirshekari</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammadreza%20Moradi"> Mohammadreza Moradi</a>, <a href="https://publications.waset.org/abstracts/search?q=Hossein%20Jafari"> Hossein Jafari</a>, <a href="https://publications.waset.org/abstracts/search?q=Mehdi%20Jafari"> Mehdi Jafari</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammad%20Ensaf"> Mohammad Ensaf</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This research employs Gaussian Process Regression (GPR) with an ensemble kernel, integrating Exponential Squared, Revised Matern, and Rational Quadratic kernels to analyze pharmaceutical sales data. Bayesian optimization was used to identify optimal kernel weights: 0.76 for Exponential Squared, 0.21 for Revised Matern, and 0.13 for Rational Quadratic. The ensemble kernel demonstrated superior performance in predictive accuracy, achieving an R² score near 1.0, and significantly lower values in MSE, MAE, and RMSE. These findings highlight the efficacy of ensemble kernels in GPR for predictive analytics in complex pharmaceutical sales datasets. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Gaussian%20process%20regression" title="Gaussian process regression">Gaussian process regression</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20kernels" title=" ensemble kernels"> ensemble kernels</a>, <a href="https://publications.waset.org/abstracts/search?q=bayesian%20optimization" title=" bayesian optimization"> bayesian optimization</a>, <a href="https://publications.waset.org/abstracts/search?q=pharmaceutical%20sales%20analysis" title=" pharmaceutical sales analysis"> pharmaceutical sales analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=time%20series%20forecasting" title=" time series forecasting"> time series forecasting</a>, <a href="https://publications.waset.org/abstracts/search?q=data%20analysis" title=" data analysis"> data analysis</a> </p> <a href="https://publications.waset.org/abstracts/181581/enhancing-predictive-accuracy-in-pharmaceutical-sales-through-an-ensemble-kernel-gaussian-process-regression-approach" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/181581.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">71</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3717</span> Stacking Ensemble Approach for Combining Different Methods in Real Estate Prediction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sol%20Girouard">Sol Girouard</a>, <a href="https://publications.waset.org/abstracts/search?q=Zona%20Kostic"> Zona Kostic</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A home is often the largest and most expensive purchase a person makes. Whether the decision leads to a successful outcome will be determined by a combination of critical factors. In this paper, we propose a method that efficiently handles all the factors in residential real estate and performs predictions given a feature space with high dimensionality while controlling for overfitting. The proposed method was built on gradient descent and boosting algorithms and uses a mixed optimizing technique to improve the prediction power. Usually, a single model cannot handle all the cases thus our approach builds multiple models based on different subsets of the predictors. The algorithm was tested on 3 million homes across the U.S., and the experimental results demonstrate the efficiency of this approach by outperforming techniques currently used in forecasting prices. With everyday changes on the real estate market, our proposed algorithm capitalizes from new events allowing more efficient predictions. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=real%20estate%20prediction" title="real estate prediction">real estate prediction</a>, <a href="https://publications.waset.org/abstracts/search?q=gradient%20descent" title=" gradient descent"> gradient descent</a>, <a href="https://publications.waset.org/abstracts/search?q=boosting" title=" boosting"> boosting</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20methods" title=" ensemble methods"> ensemble methods</a>, <a href="https://publications.waset.org/abstracts/search?q=active%20learning" title=" active learning"> active learning</a>, <a href="https://publications.waset.org/abstracts/search?q=training" title=" training"> training</a> </p> <a href="https://publications.waset.org/abstracts/90597/stacking-ensemble-approach-for-combining-different-methods-in-real-estate-prediction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/90597.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">277</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3716</span> Application of Bayesian Model Averaging and Geostatistical Output Perturbation to Generate Calibrated Ensemble Weather Forecast</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Muhammad%20Luthfi">Muhammad Luthfi</a>, <a href="https://publications.waset.org/abstracts/search?q=Sutikno%20Sutikno"> Sutikno Sutikno</a>, <a href="https://publications.waset.org/abstracts/search?q=Purhadi%20Purhadi"> Purhadi Purhadi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Weather forecast has necessarily been improved to provide the communities an accurate and objective prediction as well. To overcome such issue, the numerical-based weather forecast was extensively developed to reduce the subjectivity of forecast. Yet the Numerical Weather Predictions (NWPs) outputs are unfortunately issued without taking dynamical weather behavior and local terrain features into account. Thus, NWPs outputs are not able to accurately forecast the weather quantities, particularly for medium and long range forecast. The aim of this research is to aid and extend the development of ensemble forecast for Meteorology, Climatology, and Geophysics Agency of Indonesia. Ensemble method is an approach combining various deterministic forecast to produce more reliable one. However, such forecast is biased and uncalibrated due to its underdispersive or overdispersive nature. As one of the parametric methods, Bayesian Model Averaging (BMA) generates the calibrated ensemble forecast and constructs predictive PDF for specified period. Such method is able to utilize ensemble of any size but does not take spatial correlation into account. Whereas space dependencies involve the site of interest and nearby site, influenced by dynamic weather behavior. Meanwhile, Geostatistical Output Perturbation (GOP) reckons the spatial correlation to generate future weather quantities, though merely built by a single deterministic forecast, and is able to generate an ensemble of any size as well. This research conducts both BMA and GOP to generate the calibrated ensemble forecast for the daily temperature at few meteorological sites nearby Indonesia international airport. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Bayesian%20Model%20Averaging" title="Bayesian Model Averaging">Bayesian Model Averaging</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20forecast" title=" ensemble forecast"> ensemble forecast</a>, <a href="https://publications.waset.org/abstracts/search?q=geostatistical%20output%20perturbation" title=" geostatistical output perturbation"> geostatistical output perturbation</a>, <a href="https://publications.waset.org/abstracts/search?q=numerical%20weather%20prediction" title=" numerical weather prediction"> numerical weather prediction</a>, <a href="https://publications.waset.org/abstracts/search?q=temperature" title=" temperature"> temperature</a> </p> <a href="https://publications.waset.org/abstracts/68771/application-of-bayesian-model-averaging-and-geostatistical-output-perturbation-to-generate-calibrated-ensemble-weather-forecast" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/68771.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">280</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3715</span> Lipschitz Classifiers Ensembles: Usage for Classification of Target Events in C-OTDR Monitoring Systems </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Andrey%20V.%20Timofeev">Andrey V. Timofeev</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This paper introduces an original method for guaranteed estimation of the accuracy of an ensemble of Lipschitz classifiers. The solution was obtained as a finite closed set of alternative hypotheses, which contains an object of classification with a probability of not less than the specified value. Thus, the classification is represented by a set of hypothetical classes. In this case, the smaller the cardinality of the discrete set of hypothetical classes is, the higher is the classification accuracy. Experiments have shown that if the cardinality of the classifiers ensemble is increased then the cardinality of this set of hypothetical classes is reduced. The problem of the guaranteed estimation of the accuracy of an ensemble of Lipschitz classifiers is relevant in the multichannel classification of target events in C-OTDR monitoring systems. Results of suggested approach practical usage to accuracy control in C-OTDR monitoring systems are present. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Lipschitz%20classifiers" title="Lipschitz classifiers">Lipschitz classifiers</a>, <a href="https://publications.waset.org/abstracts/search?q=confidence%20set" title=" confidence set"> confidence set</a>, <a href="https://publications.waset.org/abstracts/search?q=C-OTDR%20monitoring" title=" C-OTDR monitoring"> C-OTDR monitoring</a>, <a href="https://publications.waset.org/abstracts/search?q=classifiers%20accuracy" title=" classifiers accuracy"> classifiers accuracy</a>, <a href="https://publications.waset.org/abstracts/search?q=classifiers%20ensemble" title=" classifiers ensemble"> classifiers ensemble</a> </p> <a href="https://publications.waset.org/abstracts/21073/lipschitz-classifiers-ensembles-usage-for-classification-of-target-events-in-c-otdr-monitoring-systems" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/21073.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">492</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3714</span> Simulation of Optimal Runoff Hydrograph Using Ensemble of Radar Rainfall and Blending of Runoffs Model</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Myungjin%20Lee">Myungjin Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=Daegun%20Han"> Daegun Han</a>, <a href="https://publications.waset.org/abstracts/search?q=Jongsung%20Kim"> Jongsung Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Soojun%20Kim"> Soojun Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Hung%20Soo%20Kim"> Hung Soo Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Recently, the localized heavy rainfall and typhoons are frequently occurred due to the climate change and the damage is becoming bigger. Therefore, we may need a more accurate prediction of the rainfall and runoff. However, the gauge rainfall has the limited accuracy in space. Radar rainfall is better than gauge rainfall for the explanation of the spatial variability of rainfall but it is mostly underestimated with the uncertainty involved. Therefore, the ensemble of radar rainfall was simulated using error structure to overcome the uncertainty and gauge rainfall. The simulated ensemble was used as the input data of the rainfall-runoff models for obtaining the ensemble of runoff hydrographs. The previous studies discussed about the accuracy of the rainfall-runoff model. Even if the same input data such as rainfall is used for the runoff analysis using the models in the same basin, the models can have different results because of the uncertainty involved in the models. Therefore, we used two models of the SSARR model which is the lumped model, and the Vflo model which is a distributed model and tried to simulate the optimum runoff considering the uncertainty of each rainfall-runoff model. The study basin is located in Han river basin and we obtained one integrated runoff hydrograph which is an optimum runoff hydrograph using the blending methods such as Multi-Model Super Ensemble (MMSE), Simple Model Average (SMA), Mean Square Error (MSE). From this study, we could confirm the accuracy of rainfall and rainfall-runoff model using ensemble scenario and various rainfall-runoff model and we can use this result to study flood control measure due to climate change. Acknowledgements: This work is supported by the Korea Agency for Infrastructure Technology Advancement(KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant 18AWMP-B083066-05). <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=radar%20rainfall%20ensemble" title="radar rainfall ensemble">radar rainfall ensemble</a>, <a href="https://publications.waset.org/abstracts/search?q=rainfall-runoff%20models" title=" rainfall-runoff models"> rainfall-runoff models</a>, <a href="https://publications.waset.org/abstracts/search?q=blending%20method" title=" blending method"> blending method</a>, <a href="https://publications.waset.org/abstracts/search?q=optimum%20runoff%20hydrograph" title=" optimum runoff hydrograph"> optimum runoff hydrograph</a> </p> <a href="https://publications.waset.org/abstracts/76203/simulation-of-optimal-runoff-hydrograph-using-ensemble-of-radar-rainfall-and-blending-of-runoffs-model" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/76203.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">280</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3713</span> Multiple Relaxation Times in the Gibbs Ensemble Monte Carlo Simulation of Phase Separation</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Bina%20Kumari">Bina Kumari</a>, <a href="https://publications.waset.org/abstracts/search?q=Subir%20K.%20Sarkar"> Subir K. Sarkar</a>, <a href="https://publications.waset.org/abstracts/search?q=Pradipta%20Bandyopadhyay"> Pradipta Bandyopadhyay</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The autocorrelation function of the density fluctuation is studied in each of the two phases in a Gibbs Ensemble Monte Carlo (GEMC) simulation of the problem of phase separation for a square well potential with various values of its range. We find that the normalized autocorrelation function is described very well as a linear combination of an exponential function with a time scale τ₂ and a stretched exponential function with a time scale τ₁ and an exponent α. Dependence of (α, τ₁, τ₂) on the parameters of the GEMC algorithm and the range of the square well potential is investigated and interpreted. We also analyse the issue of how to choose the parameters of the GEMC simulation optimally. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=autocorrelation%20function" title="autocorrelation function">autocorrelation function</a>, <a href="https://publications.waset.org/abstracts/search?q=density%20fluctuation" title=" density fluctuation"> density fluctuation</a>, <a href="https://publications.waset.org/abstracts/search?q=GEMC" title=" GEMC"> GEMC</a>, <a href="https://publications.waset.org/abstracts/search?q=simulation" title=" simulation"> simulation</a> </p> <a href="https://publications.waset.org/abstracts/131552/multiple-relaxation-times-in-the-gibbs-ensemble-monte-carlo-simulation-of-phase-separation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/131552.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">188</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3712</span> Neuroevolution Based on Adaptive Ensembles of Biologically Inspired Optimization Algorithms Applied for Modeling a Chemical Engineering Process</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sabina-Adriana%20Floria">Sabina-Adriana Floria</a>, <a href="https://publications.waset.org/abstracts/search?q=Marius%20Gavrilescu"> Marius Gavrilescu</a>, <a href="https://publications.waset.org/abstracts/search?q=Florin%20Leon"> Florin Leon</a>, <a href="https://publications.waset.org/abstracts/search?q=Silvia%20Curteanu"> Silvia Curteanu</a>, <a href="https://publications.waset.org/abstracts/search?q=Costel%20Anton"> Costel Anton</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Neuroevolution is a subfield of artificial intelligence used to solve various problems in different application areas. Specifically, neuroevolution is a technique that applies biologically inspired methods to generate neural network architectures and optimize their parameters automatically. In this paper, we use different biologically inspired optimization algorithms in an ensemble strategy with the aim of training multilayer perceptron neural networks, resulting in regression models used to simulate the industrial chemical process of obtaining bricks from silicone-based materials. Installations in the raw ceramics industry, i.e., bricks, are characterized by significant energy consumption and large quantities of emissions. In addition, the initial conditions that were taken into account during the design and commissioning of the installation can change over time, which leads to the need to add new mixes to adjust the operating conditions for the desired purpose, e.g., material properties and energy saving. The present approach follows the study by simulation of a process of obtaining bricks from silicone-based materials, i.e., the modeling and optimization of the process. Optimization aims to determine the working conditions that minimize the emissions represented by nitrogen monoxide. We first use a search procedure to find the best values for the parameters of various biologically inspired optimization algorithms. Then, we propose an adaptive ensemble strategy that uses only a subset of the best algorithms identified in the search stage. The adaptive ensemble strategy combines the results of selected algorithms and automatically assigns more processing capacity to the more efficient algorithms. Their efficiency may also vary at different stages of the optimization process. In a given ensemble iteration, the most efficient algorithms aim to maintain good convergence, while the less efficient algorithms can improve population diversity. The proposed adaptive ensemble strategy outperforms the individual optimizers and the non-adaptive ensemble strategy in convergence speed, and the obtained results provide lower error values. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=optimization" title="optimization">optimization</a>, <a href="https://publications.waset.org/abstracts/search?q=biologically%20inspired%20algorithm" title=" biologically inspired algorithm"> biologically inspired algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=neuroevolution" title=" neuroevolution"> neuroevolution</a>, <a href="https://publications.waset.org/abstracts/search?q=ensembles" title=" ensembles"> ensembles</a>, <a href="https://publications.waset.org/abstracts/search?q=bricks" title=" bricks"> bricks</a>, <a href="https://publications.waset.org/abstracts/search?q=emission%20minimization" title=" emission minimization"> emission minimization</a> </p> <a href="https://publications.waset.org/abstracts/162135/neuroevolution-based-on-adaptive-ensembles-of-biologically-inspired-optimization-algorithms-applied-for-modeling-a-chemical-engineering-process" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/162135.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">116</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3711</span> Machine Learning Model to Predict TB Bacteria-Resistant Drugs from TB Isolates</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Rosa%20Tsegaye%20Aga">Rosa Tsegaye Aga</a>, <a href="https://publications.waset.org/abstracts/search?q=Xuan%20Jiang"> Xuan Jiang</a>, <a href="https://publications.waset.org/abstracts/search?q=Pavel%20Vazquez%20Faci"> Pavel Vazquez Faci</a>, <a href="https://publications.waset.org/abstracts/search?q=Siqing%20Liu"> Siqing Liu</a>, <a href="https://publications.waset.org/abstracts/search?q=Simon%20Rayner"> Simon Rayner</a>, <a href="https://publications.waset.org/abstracts/search?q=Endalkachew%20Alemu"> Endalkachew Alemu</a>, <a href="https://publications.waset.org/abstracts/search?q=Markos%0D%0AAbebe"> Markos Abebe</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Tuberculosis (TB) is a major cause of disease globally. In most cases, TB is treatable and curable, but only with the proper treatment. There is a time when drug-resistant TB occurs when bacteria become resistant to the drugs that are used to treat TB. Current strategies to identify drug-resistant TB bacteria are laboratory-based, and it takes a longer time to identify the drug-resistant bacteria and treat the patient accordingly. But machine learning (ML) and data science approaches can offer new approaches to the problem. In this study, we propose to develop an ML-based model to predict the antibiotic resistance phenotypes of TB isolates in minutes and give the right treatment to the patient immediately. The study has been using the whole genome sequence (WGS) of TB isolates as training data that have been extracted from the NCBI repository and contain different countries’ samples to build the ML models. The reason that different countries’ samples have been included is to generalize the large group of TB isolates from different regions in the world. This supports the model to train different behaviors of the TB bacteria and makes the model robust. The model training has been considering three pieces of information that have been extracted from the WGS data to train the model. These are all variants that have been found within the candidate genes (F1), predetermined resistance-associated variants (F2), and only resistance-associated gene information for the particular drug. Two major datasets have been constructed using these three information. F1 and F2 information have been considered as two independent datasets, and the third information is used as a class to label the two datasets. Five machine learning algorithms have been considered to train the model. These are Support Vector Machine (SVM), Random forest (RF), Logistic regression (LR), Gradient Boosting, and Ada boost algorithms. The models have been trained on the datasets F1, F2, and F1F2 that is the F1 and the F2 dataset merged. Additionally, an ensemble approach has been used to train the model. The ensemble approach has been considered to run F1 and F2 datasets on gradient boosting algorithm and use the output as one dataset that is called F1F2 ensemble dataset and train a model using this dataset on the five algorithms. As the experiment shows, the ensemble approach model that has been trained on the Gradient Boosting algorithm outperformed the rest of the models. In conclusion, this study suggests the ensemble approach, that is, the RF + Gradient boosting model, to predict the antibiotic resistance phenotypes of TB isolates by outperforming the rest of the models. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title="machine learning">machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=MTB" title=" MTB"> MTB</a>, <a href="https://publications.waset.org/abstracts/search?q=WGS" title=" WGS"> WGS</a>, <a href="https://publications.waset.org/abstracts/search?q=drug%20resistant%20TB" title=" drug resistant TB"> drug resistant TB</a> </p> <a href="https://publications.waset.org/abstracts/181519/machine-learning-model-to-predict-tb-bacteria-resistant-drugs-from-tb-isolates" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/181519.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">52</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3710</span> A Hybrid Data Mining Algorithm Based System for Intelligent Defence Mission Readiness and Maintenance Scheduling</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shivam%20Dwivedi">Shivam Dwivedi</a>, <a href="https://publications.waset.org/abstracts/search?q=Sumit%20Prakash%20Gupta"> Sumit Prakash Gupta</a>, <a href="https://publications.waset.org/abstracts/search?q=Durga%20Toshniwal"> Durga Toshniwal</a> </p> <p class="card-text"><strong>Abstract:</strong></p> It is a challenging task in today’s date to keep defence forces in the highest state of combat readiness with budgetary constraints. A huge amount of time and money is squandered in the unnecessary and expensive traditional maintenance activities. To overcome this limitation Defence Intelligent Mission Readiness and Maintenance Scheduling System has been proposed, which ameliorates the maintenance system by diagnosing the condition and predicting the maintenance requirements. Based on new data mining algorithms, this system intelligently optimises mission readiness for imminent operations and maintenance scheduling in repair echelons. With modified data mining algorithms such as Weighted Feature Ranking Genetic Algorithm and SVM-Random Forest Linear ensemble, it improves the reliability, availability and safety, alongside reducing maintenance cost and Equipment Out of Action (EOA) time. The results clearly conclude that the introduced algorithms have an edge over the conventional data mining algorithms. The system utilizing the intelligent condition-based maintenance approach improves the operational and maintenance decision strategy of the defence force. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=condition%20based%20maintenance" title="condition based maintenance">condition based maintenance</a>, <a href="https://publications.waset.org/abstracts/search?q=data%20mining" title=" data mining"> data mining</a>, <a href="https://publications.waset.org/abstracts/search?q=defence%20maintenance" title=" defence maintenance"> defence maintenance</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble" title=" ensemble"> ensemble</a>, <a href="https://publications.waset.org/abstracts/search?q=genetic%20algorithms" title=" genetic algorithms"> genetic algorithms</a>, <a href="https://publications.waset.org/abstracts/search?q=maintenance%20scheduling" title=" maintenance scheduling"> maintenance scheduling</a>, <a href="https://publications.waset.org/abstracts/search?q=mission%20capability" title=" mission capability"> mission capability</a> </p> <a href="https://publications.waset.org/abstracts/73229/a-hybrid-data-mining-algorithm-based-system-for-intelligent-defence-mission-readiness-and-maintenance-scheduling" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/73229.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">297</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">3709</span> Statistical Comparison of Ensemble Based Storm Surge Forecasting Models</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Amin%20Salighehdar">Amin Salighehdar</a>, <a href="https://publications.waset.org/abstracts/search?q=Ziwen%20Ye"> Ziwen Ye</a>, <a href="https://publications.waset.org/abstracts/search?q=Mingzhe%20Liu"> Mingzhe Liu</a>, <a href="https://publications.waset.org/abstracts/search?q=Ionut%20%20Florescu"> Ionut Florescu</a>, <a href="https://publications.waset.org/abstracts/search?q=Alan%20F.%20Blumberg"> Alan F. Blumberg</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers developed various ensemble modeling techniques to combine several individual forecasts to produce an overall presumably better forecast. There exist some simple ensemble modeling techniques in literature. For instance, Model Output Statistics (MOS), and running mean-bias removal are widely used techniques in storm surge prediction domain. However, these methods have some drawbacks. For instance, MOS is based on multiple linear regression and it needs a long period of training data. To overcome the shortcomings of these simple methods, researchers propose some advanced methods. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecast. This application creates a better forecast of sea level using a combination of several instances of the Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether the ensemble models perform better than any single forecast. Therefore, we need to identify the single best forecast. We present a methodology based on a simple Bayesian selection method to select the best single forecast. Second, we present several new and simple ways to construct ensemble models. We use correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles and compare with several existing models in literature to forecast storm surge level. We then investigate whether developing a complex ensemble model is indeed needed. To achieve this goal, we use a simple average (one of the simplest and widely used ensemble model) as benchmark. Predicting the peak level of Surge during a storm as well as the precise time at which this peak level takes place is crucial, thus we develop a statistical platform to compare the performance of various ensemble methods. This statistical analysis is based on root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual time and peak. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, in this study we consider them as a single contiguous hurricane event. The data set used for this study is generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally have better performance compared to the simple average ensemble technique. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Bayesian%20learning" title="Bayesian learning">Bayesian learning</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20model" title=" ensemble model"> ensemble model</a>, <a href="https://publications.waset.org/abstracts/search?q=statistical%20analysis" title=" statistical analysis"> statistical analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=storm%20surge%20prediction" title=" storm surge prediction"> storm surge prediction</a> </p> <a href="https://publications.waset.org/abstracts/70123/statistical-comparison-of-ensemble-based-storm-surge-forecasting-models" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/70123.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">309</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">‹</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=8">8</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=9">9</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=10">10</a></li> <li class="page-item disabled"><span class="page-link">...</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=124">124</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=125">125</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm&page=2" rel="next">›</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">© 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">×</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>