Search results for: multi-model ensemble
class="dropdown-item" href="https://publications.waset.org/abstracts">Abstracts</a> <a class="dropdown-item" href="https://publications.waset.org">Periodicals</a> <a class="dropdown-item" href="https://publications.waset.org/archive">Archive</a> </div> </li> <li class="nav-item"> <a class="nav-link" href="https://waset.org/page/support" title="Support">Support</a> </li> </ul> </div> </div> </nav> </div> </header> <main> <div class="container mt-4"> <div class="row"> <div class="col-md-9 mx-auto"> <form method="get" action="https://publications.waset.org/abstracts/search"> <div id="custom-search-input"> <div class="input-group"> <i class="fas fa-search"></i> <input type="text" class="search-query" name="q" placeholder="Author, Title, Abstract, Keywords" value="multi-model ensemble"> <input type="submit" class="btn_search" value="Search"> </div> </div> </form> </div> </div> <div class="row mt-3"> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Commenced</strong> in January 2007</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Frequency:</strong> Monthly</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Edition:</strong> International</div> </div> </div> <div class="col-sm-3"> <div class="card"> <div class="card-body"><strong>Paper Count:</strong> 192</div> </div> </div> </div> <h1 class="mt-3 mb-3 text-center" style="font-size:1.6rem;">Search results for: multi-model ensemble</h1> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">192</span> Evaluation of Ensemble Classifiers for Intrusion Detection </h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=M.%20Govindarajan">M. Govindarajan </a> </p> <p class="card-text"><strong>Abstract:</strong></p> One of the major developments in machine learning in the past decade is the ensemble method, which finds highly accurate classifier by combining many moderately accurate component classifiers. In this research work, new ensemble classification methods are proposed with homogeneous ensemble classifier using bagging and heterogeneous ensemble classifier using arcing and their performances are analyzed in terms of accuracy. A Classifier ensemble is designed using Radial Basis Function (RBF) and Support Vector Machine (SVM) as base classifiers. The feasibility and the benefits of the proposed approaches are demonstrated by the means of standard datasets of intrusion detection. The main originality of the proposed approach is based on three main parts: preprocessing phase, classification phase, and combining phase. A wide range of comparative experiments is conducted for standard datasets of intrusion detection. The performance of the proposed homogeneous and heterogeneous ensemble classifiers are compared to the performance of other standard homogeneous and heterogeneous ensemble methods. The standard homogeneous ensemble methods include Error correcting output codes, Dagging and heterogeneous ensemble methods include majority voting, stacking. The proposed ensemble methods provide significant improvement of accuracy compared to individual classifiers and the proposed bagged RBF and SVM performs significantly better than ECOC and Dagging and the proposed hybrid RBF-SVM performs significantly better than voting and stacking. 

191. Random Subspace Ensemble of CMAC Classifiers
Authors: Somaiyeh Dehghan, Mohammad Reza Kheirkhahan Haghighi
Abstract: The rapid growth of domains whose data have a large number of features but a limited number of samples has made it difficult to construct strong classifiers. Reducing the dimensionality of the feature space therefore becomes an essential step in the classification task. The random subspace method (or attribute bagging) is an ensemble technique in which each base learner is trained on a subset of the features. In this paper, we introduce the Random Subspace Ensemble of CMAC neural networks (RSE-CMAC), in which each CMAC is trained on a subset of the features, and we use this model for the classification task. To evaluate the performance of our model, we compare it with the bagging algorithm on 36 UCI datasets. The results reveal that the new model has better performance.
Keywords: classification, random subspace, ensemble, CMAC neural network
PDF: https://publications.waset.org/abstracts/14371.pdf | Downloads: 329
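
The random subspace (attribute bagging) scheme described above maps onto scikit-learn's BaggingClassifier when feature sampling is enabled and sample bootstrapping is switched off. In the hedged sketch below a decision tree stands in for the CMAC network, which has no standard scikit-learn implementation, and a built-in dataset replaces the 36 UCI sets.

```python
# Random subspace ensemble sketch: every base learner sees a random half of
# the features and all training samples. A decision tree stands in for the
# CMAC network used in the paper.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
rse = BaggingClassifier(estimator=DecisionTreeClassifier(random_state=0),
                        n_estimators=50,
                        max_features=0.5,   # random feature subset per learner
                        bootstrap=False,    # keep all samples: subspace only
                        random_state=0)
print("5-fold CV accuracy:", round(cross_val_score(rse, X, y, cv=5).mean(), 4))
```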
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=classification" title="classification">classification</a>, <a href="https://publications.waset.org/abstracts/search?q=random%20subspace" title=" random subspace"> random subspace</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble" title=" ensemble"> ensemble</a>, <a href="https://publications.waset.org/abstracts/search?q=CMAC%20neural%20network" title=" CMAC neural network"> CMAC neural network</a> </p> <a href="https://publications.waset.org/abstracts/14371/random-subspace-ensemble-of-cmac-classifiers" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/14371.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">329</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">190</span> Fuzzy Optimization Multi-Objective Clustering Ensemble Model for Multi-Source Data Analysis</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=C.%20B.%20Le">C. B. Le</a>, <a href="https://publications.waset.org/abstracts/search?q=V.%20N.%20Pham"> V. N. Pham</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In modern data analysis, multi-source data appears more and more in real applications. Multi-source data clustering has emerged as a important issue in the data mining and machine learning community. Different data sources provide information about different data. Therefore, multi-source data linking is essential to improve clustering performance. However, in practice multi-source data is often heterogeneous, uncertain, and large. This issue is considered a major challenge from multi-source data. Ensemble is a versatile machine learning model in which learning techniques can work in parallel, with big data. Clustering ensemble has been shown to outperform any standard clustering algorithm in terms of accuracy and robustness. However, most of the traditional clustering ensemble approaches are based on single-objective function and single-source data. This paper proposes a new clustering ensemble method for multi-source data analysis. The fuzzy optimized multi-objective clustering ensemble method is called FOMOCE. Firstly, a clustering ensemble mathematical model based on the structure of multi-objective clustering function, multi-source data, and dark knowledge is introduced. Then, rules for extracting dark knowledge from the input data, clustering algorithms, and base clusterings are designed and applied. Finally, a clustering ensemble algorithm is proposed for multi-source data analysis. The experiments were performed on the standard sample data set. The experimental results demonstrate the superior performance of the FOMOCE method compared to the existing clustering ensemble methods and multi-source clustering methods. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=clustering%20ensemble" title="clustering ensemble">clustering ensemble</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-source" title=" multi-source"> multi-source</a>, <a href="https://publications.waset.org/abstracts/search?q=multi-objective" title=" multi-objective"> multi-objective</a>, <a href="https://publications.waset.org/abstracts/search?q=fuzzy%20clustering" title=" fuzzy clustering"> fuzzy clustering</a> </p> <a href="https://publications.waset.org/abstracts/136598/fuzzy-optimization-multi-objective-clustering-ensemble-model-for-multi-source-data-analysis" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/136598.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">189</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">189</span> Rank-Based Chain-Mode Ensemble for Binary Classification</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Chongya%20Song">Chongya Song</a>, <a href="https://publications.waset.org/abstracts/search?q=Kang%20Yen"> Kang Yen</a>, <a href="https://publications.waset.org/abstracts/search?q=Alexander%20Pons"> Alexander Pons</a>, <a href="https://publications.waset.org/abstracts/search?q=Jin%20Liu"> Jin Liu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the field of machine learning, the ensemble has been employed as a common methodology to improve the performance upon multiple base classifiers. However, the true predictions are often canceled out by the false ones during consensus due to a phenomenon called “curse of correlation” which is represented as the strong interferences among the predictions produced by the base classifiers. In addition, the existing practices are still not able to effectively mitigate the problem of imbalanced classification. Based on the analysis on our experiment results, we conclude that the two problems are caused by some inherent deficiencies in the approach of consensus. Therefore, we create an enhanced ensemble algorithm which adopts a designed rank-based chain-mode consensus to overcome the two problems. In order to evaluate the proposed ensemble algorithm, we employ a well-known benchmark data set NSL-KDD (the improved version of dataset KDDCup99 produced by University of New Brunswick) to make comparisons between the proposed and 8 common ensemble algorithms. Particularly, each compared ensemble classifier uses the same 22 base classifiers, so that the differences in terms of the improvements toward the accuracy and reliability upon the base classifiers can be truly revealed. As a result, the proposed rank-based chain-mode consensus is proved to be a more effective ensemble solution than the traditional consensus approach, which outperforms the 8 ensemble algorithms by 20% on almost all compared metrices which include accuracy, precision, recall, F1-score and area under receiver operating characteristic curve. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=consensus" title="consensus">consensus</a>, <a href="https://publications.waset.org/abstracts/search?q=curse%20of%20correlation" title=" curse of correlation"> curse of correlation</a>, <a href="https://publications.waset.org/abstracts/search?q=imbalance%20classification" title=" imbalance classification"> imbalance classification</a>, <a href="https://publications.waset.org/abstracts/search?q=rank-based%20chain-mode%20ensemble" title=" rank-based chain-mode ensemble"> rank-based chain-mode ensemble</a> </p> <a href="https://publications.waset.org/abstracts/112891/rank-based-chain-mode-ensemble-for-binary-classification" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/112891.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">138</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">188</span> Sentiment Analysis of Ensemble-Based Classifiers for E-Mail Data</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Muthukumarasamy%20Govindarajan">Muthukumarasamy Govindarajan</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Detection of unwanted, unsolicited mails called spam from email is an interesting area of research. It is necessary to evaluate the performance of any new spam classifier using standard data sets. Recently, ensemble-based classifiers have gained popularity in this domain. In this research work, an efficient email filtering approach based on ensemble methods is addressed for developing an accurate and sensitive spam classifier. The proposed approach employs Naive Bayes (NB), Support Vector Machine (SVM) and Genetic Algorithm (GA) as base classifiers along with different ensemble methods. The experimental results show that the ensemble classifier was performing with accuracy greater than individual classifiers, and also hybrid model results are found to be better than the combined models for the e-mail dataset. The proposed ensemble-based classifiers turn out to be good in terms of classification accuracy, which is considered to be an important criterion for building a robust spam classifier. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=accuracy" title="accuracy">accuracy</a>, <a href="https://publications.waset.org/abstracts/search?q=arcing" title=" arcing"> arcing</a>, <a href="https://publications.waset.org/abstracts/search?q=bagging" title=" bagging"> bagging</a>, <a href="https://publications.waset.org/abstracts/search?q=genetic%20algorithm" title=" genetic algorithm"> genetic algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=Naive%20Bayes" title=" Naive Bayes"> Naive Bayes</a>, <a href="https://publications.waset.org/abstracts/search?q=sentiment%20mining" title=" sentiment mining"> sentiment mining</a>, <a href="https://publications.waset.org/abstracts/search?q=support%20vector%20machine" title=" support vector machine"> support vector machine</a> </p> <a href="https://publications.waset.org/abstracts/112240/sentiment-analysis-of-ensemble-based-classifiers-for-e-mail-data" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/112240.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">142</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">187</span> Extreme Temperature Response to Solar Radiation Management in Southeast Asia</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Heri%20Kuswanto">Heri Kuswanto</a>, <a href="https://publications.waset.org/abstracts/search?q=Brina%20Miftahurrohmah"> Brina Miftahurrohmah</a>, <a href="https://publications.waset.org/abstracts/search?q=Fatkhurokhman%20Fauzi"> Fatkhurokhman Fauzi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Southeast Asia has experienced rising temperatures and is predicted to reach a 1.5°C increase by 2030, which is earlier than the Paris Agreement target. Solar Radiation Management (SRM) has been proposed as an alternative to combat global warming. This research investigates changes in the annual maximum temperature (TXx) with and without SRM over southeast Asia. We examined outputs from three ensemble members of the Geoengineering Large Ensemble Project (GLENS) experiment for the period 2051 to 2080. One ensemble member generated outputs that significantly deviated from the others, leading to the removal of ensemble 3 from the impact analysis. Our observations indicate that the magnitude of TXx changes with SRM is heterogeneous across countries. We found that SRM significantly reduces TXx levels compared to historical periods. Furthermore, SRM can reduce temperatures by up to 5°C compared to scenarios without SRM, with even more pronounced effects in Thailand, Cambodia, Laos, and Myanmar. This indicates that SRM can mitigate climate change by lowering future TXx levels. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=solar%20radiation%20management" title="solar radiation management">solar radiation management</a>, <a href="https://publications.waset.org/abstracts/search?q=GLENS" title=" GLENS"> GLENS</a>, <a href="https://publications.waset.org/abstracts/search?q=extreme" title=" extreme"> extreme</a>, <a href="https://publications.waset.org/abstracts/search?q=temperature" title=" temperature"> temperature</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble" title=" ensemble"> ensemble</a> </p> <a href="https://publications.waset.org/abstracts/193495/extreme-temperature-response-to-solar-radiation-management-in-southeast-asia" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/193495.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">14</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">186</span> Enhancing Predictive Accuracy in Pharmaceutical Sales through an Ensemble Kernel Gaussian Process Regression Approach</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shahin%20Mirshekari">Shahin Mirshekari</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammadreza%20Moradi"> Mohammadreza Moradi</a>, <a href="https://publications.waset.org/abstracts/search?q=Hossein%20Jafari"> Hossein Jafari</a>, <a href="https://publications.waset.org/abstracts/search?q=Mehdi%20Jafari"> Mehdi Jafari</a>, <a href="https://publications.waset.org/abstracts/search?q=Mohammad%20Ensaf"> Mohammad Ensaf</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This research employs Gaussian Process Regression (GPR) with an ensemble kernel, integrating Exponential Squared, Revised Matern, and Rational Quadratic kernels to analyze pharmaceutical sales data. Bayesian optimization was used to identify optimal kernel weights: 0.76 for Exponential Squared, 0.21 for Revised Matern, and 0.13 for Rational Quadratic. The ensemble kernel demonstrated superior performance in predictive accuracy, achieving an R² score near 1.0, and significantly lower values in MSE, MAE, and RMSE. These findings highlight the efficacy of ensemble kernels in GPR for predictive analytics in complex pharmaceutical sales datasets. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Gaussian%20process%20regression" title="Gaussian process regression">Gaussian process regression</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20kernels" title=" ensemble kernels"> ensemble kernels</a>, <a href="https://publications.waset.org/abstracts/search?q=bayesian%20optimization" title=" bayesian optimization"> bayesian optimization</a>, <a href="https://publications.waset.org/abstracts/search?q=pharmaceutical%20sales%20analysis" title=" pharmaceutical sales analysis"> pharmaceutical sales analysis</a>, <a href="https://publications.waset.org/abstracts/search?q=time%20series%20forecasting" title=" time series forecasting"> time series forecasting</a>, <a href="https://publications.waset.org/abstracts/search?q=data%20analysis" title=" data analysis"> data analysis</a> </p> <a href="https://publications.waset.org/abstracts/181581/enhancing-predictive-accuracy-in-pharmaceutical-sales-through-an-ensemble-kernel-gaussian-process-regression-approach" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/181581.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">71</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">185</span> Feature Evaluation Based on Random Subspace and Multiple-K Ensemble</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Jaehong%20Yu">Jaehong Yu</a>, <a href="https://publications.waset.org/abstracts/search?q=Seoung%20Bum%20Kim"> Seoung Bum Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Clustering analysis can facilitate the extraction of intrinsic patterns in a dataset and reveal its natural groupings without requiring class information. For effective clustering analysis in high dimensional datasets, unsupervised dimensionality reduction is an important task. Unsupervised dimensionality reduction can generally be achieved by feature extraction or feature selection. In many situations, feature selection methods are more appropriate than feature extraction methods because of their clear interpretation with respect to the original features. The unsupervised feature selection can be categorized as feature subset selection and feature ranking method, and we focused on unsupervised feature ranking methods which evaluate the features based on their importance scores. Recently, several unsupervised feature ranking methods were developed based on ensemble approaches to achieve their higher accuracy and stability. However, most of the ensemble-based feature ranking methods require the true number of clusters. Furthermore, these algorithms evaluate the feature importance depending on the ensemble clustering solution, and they produce undesirable evaluation results if the clustering solutions are inaccurate. To address these limitations, we proposed an ensemble-based feature ranking method with random subspace and multiple-k ensemble (FRRM). The proposed FRRM algorithm evaluates the importance of each feature with the random subspace ensemble, and all evaluation results are combined with the ensemble importance scores. 

184. Application of Bayesian Model Averaging and Geostatistical Output Perturbation to Generate Calibrated Ensemble Weather Forecast
Authors: Muhammad Luthfi, Sutikno Sutikno, Purhadi Purhadi
Abstract: Weather forecasts have necessarily been improved to provide communities with accurate and objective predictions. To this end, numerically based weather forecasting was extensively developed to reduce the subjectivity of forecasts. Yet Numerical Weather Prediction (NWP) outputs are unfortunately issued without taking dynamic weather behavior and local terrain features into account. Thus, NWP outputs are not able to accurately forecast weather quantities, particularly for medium- and long-range forecasts. The aim of this research is to aid and extend the development of ensemble forecasting for the Meteorology, Climatology, and Geophysics Agency of Indonesia. The ensemble method is an approach that combines various deterministic forecasts to produce a more reliable one. However, such a forecast is biased and uncalibrated due to its underdispersive or overdispersive nature. As a parametric method, Bayesian Model Averaging (BMA) generates a calibrated ensemble forecast and constructs a predictive PDF for a specified period. Such a method is able to utilize an ensemble of any size but does not take spatial correlation into account, whereas spatial dependencies between the site of interest and nearby sites are influenced by dynamic weather behavior. Meanwhile, Geostatistical Output Perturbation (GOP) accounts for the spatial correlation when generating future weather quantities, though it is built from merely a single deterministic forecast, and it is likewise able to generate an ensemble of any size. This research applies both BMA and GOP to generate calibrated ensemble forecasts of daily temperature at a few meteorological sites near the Indonesian international airport.
Keywords: Bayesian Model Averaging, ensemble forecast, geostatistical output perturbation, numerical weather prediction, temperature
PDF: https://publications.waset.org/abstracts/68771.pdf | Downloads: 280
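
The BMA predictive PDF described above is a weighted mixture of densities centred on the (bias-corrected) member forecasts. The sketch below assumes Gaussian components with fixed, made-up weights and spread; in practice these are estimated from training data, typically by the EM algorithm.

```python
# BMA predictive density sketch: a Gaussian mixture centred on the member
# forecasts. Weights and spread are assumed values for illustration only.
import numpy as np
from scipy.stats import norm

member_forecasts = np.array([28.4, 29.1, 27.8])  # deterministic members (degC)
weights = np.array([0.5, 0.3, 0.2])              # assumed BMA weights (sum to 1)
sigma = 1.2                                      # assumed common member spread

def bma_pdf(t):
    """BMA predictive density at temperature t."""
    return float(np.sum(weights * norm.pdf(t, loc=member_forecasts, scale=sigma)))

bma_mean = float(np.sum(weights * member_forecasts))
grid = np.linspace(24.0, 33.0, 181)
cdf = np.cumsum([bma_pdf(t) for t in grid])
cdf /= cdf[-1]                                   # grid approximation of the CDF
lo, hi = grid[np.searchsorted(cdf, [0.05, 0.95])]
print(f"BMA predictive mean: {bma_mean:.2f} degC, 90% interval: [{lo:.1f}, {hi:.1f}]")
```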

183. Lipschitz Classifiers Ensembles: Usage for Classification of Target Events in C-OTDR Monitoring Systems
Authors: Andrey V. Timofeev
Abstract: This paper introduces an original method for the guaranteed estimation of the accuracy of an ensemble of Lipschitz classifiers. The solution is obtained as a finite closed set of alternative hypotheses, which contains the object of classification with a probability of not less than a specified value. Thus, the classification is represented by a set of hypothetical classes. In this case, the smaller the cardinality of the discrete set of hypothetical classes, the higher the classification accuracy. Experiments have shown that if the cardinality of the classifier ensemble is increased, then the cardinality of this set of hypothetical classes is reduced. The problem of the guaranteed estimation of the accuracy of an ensemble of Lipschitz classifiers is relevant to the multichannel classification of target events in C-OTDR monitoring systems. Results of the practical use of the suggested approach for accuracy control in C-OTDR monitoring systems are presented.
Keywords: Lipschitz classifiers, confidence set, C-OTDR monitoring, classifiers accuracy, classifiers ensemble
PDF: https://publications.waset.org/abstracts/21073.pdf | Downloads: 492
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Lipschitz%20classifiers" title="Lipschitz classifiers">Lipschitz classifiers</a>, <a href="https://publications.waset.org/abstracts/search?q=confidence%20set" title=" confidence set"> confidence set</a>, <a href="https://publications.waset.org/abstracts/search?q=C-OTDR%20monitoring" title=" C-OTDR monitoring"> C-OTDR monitoring</a>, <a href="https://publications.waset.org/abstracts/search?q=classifiers%20accuracy" title=" classifiers accuracy"> classifiers accuracy</a>, <a href="https://publications.waset.org/abstracts/search?q=classifiers%20ensemble" title=" classifiers ensemble"> classifiers ensemble</a> </p> <a href="https://publications.waset.org/abstracts/21073/lipschitz-classifiers-ensembles-usage-for-classification-of-target-events-in-c-otdr-monitoring-systems" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/21073.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">492</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">182</span> Simulation of Optimal Runoff Hydrograph Using Ensemble of Radar Rainfall and Blending of Runoffs Model</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Myungjin%20Lee">Myungjin Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=Daegun%20Han"> Daegun Han</a>, <a href="https://publications.waset.org/abstracts/search?q=Jongsung%20Kim"> Jongsung Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Soojun%20Kim"> Soojun Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Hung%20Soo%20Kim"> Hung Soo Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Recently, the localized heavy rainfall and typhoons are frequently occurred due to the climate change and the damage is becoming bigger. Therefore, we may need a more accurate prediction of the rainfall and runoff. However, the gauge rainfall has the limited accuracy in space. Radar rainfall is better than gauge rainfall for the explanation of the spatial variability of rainfall but it is mostly underestimated with the uncertainty involved. Therefore, the ensemble of radar rainfall was simulated using error structure to overcome the uncertainty and gauge rainfall. The simulated ensemble was used as the input data of the rainfall-runoff models for obtaining the ensemble of runoff hydrographs. The previous studies discussed about the accuracy of the rainfall-runoff model. Even if the same input data such as rainfall is used for the runoff analysis using the models in the same basin, the models can have different results because of the uncertainty involved in the models. Therefore, we used two models of the SSARR model which is the lumped model, and the Vflo model which is a distributed model and tried to simulate the optimum runoff considering the uncertainty of each rainfall-runoff model. The study basin is located in Han river basin and we obtained one integrated runoff hydrograph which is an optimum runoff hydrograph using the blending methods such as Multi-Model Super Ensemble (MMSE), Simple Model Average (SMA), Mean Square Error (MSE). 

181. Breast Cancer Survivability Prediction via Classifier Ensemble
Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia
Abstract: This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then tries to find the most important features among the four groups that maximize the weighted average F-score of a certain classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features from SEER through the feature selection module. On top of them, another classifier is used to give the final decision based on the output decisions and confidence scores of each of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses the decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same SEER data (period 1973-2002), giving an 87.39% weighted average F-score compared to 85.82% and 81.34% for the other published systems. By increasing the data size to cover the whole database (period 1973-2014), the overall weighted average F-score jumps to 92.4% on the held-out unseen test set.
Keywords: classifier ensemble, breast cancer survivability, data mining, SEER
PDF: https://publications.waset.org/abstracts/42621.pdf | Downloads: 328
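
The two-level design above is essentially stacking. A hedged scikit-learn sketch follows: there is no Bayesian-network classifier in scikit-learn, so a second Naive Bayes variant stands in for it, and synthetic data replaces the SEER records.

```python
# Two-level stacking sketch: three base classifiers feed their class
# probabilities to a Naive Bayes combiner. BernoulliNB stands in for the
# Bayesian network used in the paper.
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import BernoulliNB, GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
stack = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
                ("bnb", BernoulliNB()),
                ("gnb", GaussianNB())],
    final_estimator=GaussianNB(),          # combiner over base-level outputs
    stack_method="predict_proba")
print("CV weighted F-score:",
      round(cross_val_score(stack, X, y, cv=5, scoring="f1_weighted").mean(), 4))
```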

180. A Video Surveillance System Using an Ensemble of Simple Neural Network Classifiers
Authors: Rodrigo S. Moreira, Nelson F. F. Ebecken
Abstract: This paper proposes a maritime vessel tracker composed of an ensemble of WiSARD weightless neural network classifiers. A failure detector analyzes vessel movement with a Kalman filter and corrects the tracking, if necessary, using FFT matching. The use of the WiSARD neural network to track objects is uncommon. The additional contributions of the present study include a performance comparison with four state-of-the-art trackers, an experimental study of the features that improve maritime vessel tracking, the first use of an ensemble of classifiers to track maritime vessels, and a new quantization algorithm that compares the values of pixel pairs.
Keywords: RAM memory, WiSARD weightless neural network, object tracking, quantization
PDF: https://publications.waset.org/abstracts/49928.pdf | Downloads: 310
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ram%20memory" title="ram memory">ram memory</a>, <a href="https://publications.waset.org/abstracts/search?q=WiSARD%20weightless%20neural%20network" title=" WiSARD weightless neural network"> WiSARD weightless neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=object%20tracking" title=" object tracking"> object tracking</a>, <a href="https://publications.waset.org/abstracts/search?q=quantization" title=" quantization"> quantization</a> </p> <a href="https://publications.waset.org/abstracts/49928/a-video-surveillance-system-using-an-ensemble-of-simple-neural-network-classifiers" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/49928.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">310</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">179</span> Statistical Comparison of Ensemble Based Storm Surge Forecasting Models</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Amin%20Salighehdar">Amin Salighehdar</a>, <a href="https://publications.waset.org/abstracts/search?q=Ziwen%20Ye"> Ziwen Ye</a>, <a href="https://publications.waset.org/abstracts/search?q=Mingzhe%20Liu"> Mingzhe Liu</a>, <a href="https://publications.waset.org/abstracts/search?q=Ionut%20%20Florescu"> Ionut Florescu</a>, <a href="https://publications.waset.org/abstracts/search?q=Alan%20F.%20Blumberg"> Alan F. Blumberg</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers developed various ensemble modeling techniques to combine several individual forecasts to produce an overall presumably better forecast. There exist some simple ensemble modeling techniques in literature. For instance, Model Output Statistics (MOS), and running mean-bias removal are widely used techniques in storm surge prediction domain. However, these methods have some drawbacks. For instance, MOS is based on multiple linear regression and it needs a long period of training data. To overcome the shortcomings of these simple methods, researchers propose some advanced methods. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecast. This application creates a better forecast of sea level using a combination of several instances of the Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether the ensemble models perform better than any single forecast. Therefore, we need to identify the single best forecast. We present a methodology based on a simple Bayesian selection method to select the best single forecast. Second, we present several new and simple ways to construct ensemble models. We use correlation and standard deviation as weights in combining different forecast models. Third, we use these ensembles and compare with several existing models in literature to forecast storm surge level. 

178. Faster, Lighter, More Accurate: A Deep Learning Ensemble for Content Moderation
Authors: Arian Hosseini, Mahmudul Hasan
Abstract: To address the increasing need for efficient and accurate content moderation, we propose an efficient and lightweight deep classification ensemble structure. Our approach is based on a combination of simple visual features, designed for high-accuracy classification of violent content with low false positives. Our ensemble architecture utilizes a set of lightweight models with narrowed-down color features, and we apply it to both images and videos. We evaluated our approach using a large dataset of explosion and blast content and compared its performance to popular deep learning models such as ResNet-50. Our evaluation results demonstrate significant improvements in prediction accuracy, while benefiting from 7.64x faster inference and lower computation cost. While our approach is tailored to explosion detection, it can be applied to other similar content moderation and violence detection use cases as well. Based on our experiments, we propose a "think small, think many" philosophy in classification scenarios: transforming a single, large, monolithic deep model into a verification-based step-model ensemble of multiple small, simple, and lightweight models with narrowed-down visual features can possibly lead to predictions with higher accuracy.
Keywords: deep classification, content moderation, ensemble learning, explosion detection, video processing
PDF: https://publications.waset.org/abstracts/183644.pdf | Downloads: 55

177. Assessing Student Collaboration in Music Ensemble Class: From the Formulation of Grading Rubrics to Their Effective Implementation
Authors: Jason Sah
Abstract: Music ensemble class is a non-traditional classroom in the sense that rehearsal is always a group effort. When measuring student performance ability in class, it is imperative that the grading rubric include a collaborative skill component. Assessments that stop short of testing students' ability to make music with others undermine the group mentality by elevating individual prowess. Applying empirical and evidence-based methodology, this research develops a grading rubric that defines the criteria for assessing collaborative skill, and then explores different strategies for implementing this rubric in a timely and effective manner. Findings show that when collaborative skill is regularly tested, students gradually shift their attention from playing their own part well to sharing their part with others.
Keywords: assessment, ensemble class, grading rubric, student collaboration
PDF: https://publications.waset.org/abstracts/112833.pdf | Downloads: 135
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=assessment" title="assessment">assessment</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20class" title=" ensemble class"> ensemble class</a>, <a href="https://publications.waset.org/abstracts/search?q=grading%20rubric" title=" grading rubric"> grading rubric</a>, <a href="https://publications.waset.org/abstracts/search?q=student%20collaboration" title=" student collaboration"> student collaboration</a> </p> <a href="https://publications.waset.org/abstracts/112833/assessing-student-collaboration-in-music-ensemble-class-from-the-formulation-of-grading-rubrics-to-their-effective-implementation" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/112833.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">135</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">176</span> An Ensemble-based Method for Vehicle Color Recognition</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Saeedeh%20Barzegar%20Khalilsaraei">Saeedeh Barzegar Khalilsaraei</a>, <a href="https://publications.waset.org/abstracts/search?q=Manoocheher%20Kelarestaghi"> Manoocheher Kelarestaghi</a>, <a href="https://publications.waset.org/abstracts/search?q=Farshad%20Eshghi"> Farshad Eshghi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The vehicle color, as a prominent and stable feature, helps to identify a vehicle more accurately. As a result, vehicle color recognition is of great importance in intelligent transportation systems. Unlike conventional methods which use only a single Convolutional Neural Network (CNN) for feature extraction or classification, in this paper, four CNNs, with different architectures well-performing in different classes, are trained to extract various features from the input image. To take advantage of the distinct capability of each network, the multiple outputs are combined using a stack generalization algorithm as an ensemble technique. As a result, the final model performs better than each CNN individually in vehicle color identification. The evaluation results in terms of overall average accuracy and accuracy variance show the proposed method’s outperformance compared to the state-of-the-art rivals. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Vehicle%20Color%20Recognition" title="Vehicle Color Recognition">Vehicle Color Recognition</a>, <a href="https://publications.waset.org/abstracts/search?q=Ensemble%20Algorithm" title="Ensemble Algorithm">Ensemble Algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=Stack%20Generalization" title="Stack Generalization">Stack Generalization</a>, <a href="https://publications.waset.org/abstracts/search?q=Convolutional%20Neural%20Network" title="Convolutional Neural Network">Convolutional Neural Network</a> </p> <a href="https://publications.waset.org/abstracts/146909/an-ensemble-based-method-for-vehicle-color-recognition" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/146909.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">85</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">175</span> Machine Learning Predictive Models for Hydroponic Systems: A Case Study Nutrient Film Technique and Deep Flow Technique</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Kritiyaporn%20Kunsook">Kritiyaporn Kunsook</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Machine learning algorithms (MLAs) such us artificial neural networks (ANNs), decision tree, support vector machines (SVMs), Naïve Bayes, and ensemble classifier by voting are powerful data driven methods that are relatively less widely used in the mapping of technique of system, and thus have not been comparatively evaluated together thoroughly in this field. The performances of a series of MLAs, ANNs, decision tree, SVMs, Naïve Bayes, and ensemble classifier by voting in technique of hydroponic systems prospectively modeling are compared based on the accuracy of each model. Classification of hydroponic systems only covers the test samples from vegetables grown with Nutrient film technique (NFT) and Deep flow technique (DFT). The feature, which are the characteristics of vegetables compose harvesting height width, temperature, require light and color. The results indicate that the classification performance of the ANNs is 98%, decision tree is 98%, SVMs is 97.33%, Naïve Bayes is 96.67%, and ensemble classifier by voting is 98.96% algorithm respectively. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=artificial%20neural%20networks" title="artificial neural networks">artificial neural networks</a>, <a href="https://publications.waset.org/abstracts/search?q=decision%20tree" title=" decision tree"> decision tree</a>, <a href="https://publications.waset.org/abstracts/search?q=support%20vector%20machines" title=" support vector machines"> support vector machines</a>, <a href="https://publications.waset.org/abstracts/search?q=na%C3%AFve%20Bayes" title=" naïve Bayes"> naïve Bayes</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20classifier%20by%20voting" title=" ensemble classifier by voting"> ensemble classifier by voting</a> </p> <a href="https://publications.waset.org/abstracts/91070/machine-learning-predictive-models-for-hydroponic-systems-a-case-study-nutrient-film-technique-and-deep-flow-technique" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/91070.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">372</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">174</span> Ensemble-Based SVM Classification Approach for miRNA Prediction</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sondos%20M.%20Hammad">Sondos M. Hammad</a>, <a href="https://publications.waset.org/abstracts/search?q=Sherin%20M.%20ElGokhy"> Sherin M. ElGokhy</a>, <a href="https://publications.waset.org/abstracts/search?q=Mahmoud%20M.%20Fahmy"> Mahmoud M. Fahmy</a>, <a href="https://publications.waset.org/abstracts/search?q=Elsayed%20A.%20Sallam"> Elsayed A. Sallam</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In this paper, an ensemble-based Support Vector Machine (SVM) classification approach is proposed. It is used for miRNA prediction. Three problems, commonly associated with previous approaches, are alleviated. These problems arise due to impose assumptions on the secondary structural of premiRNA, imbalance between the numbers of the laboratory checked miRNAs and the pseudo-hairpins, and finally using a training data set that does not consider all the varieties of samples in different species. We aggregate the predicted outputs of three well-known SVM classifiers; namely, Triplet-SVM, Virgo and Mirident, weighted by their variant features without any structural assumptions. An additional SVM layer is used in aggregating the final output. The proposed approach is trained and then tested with balanced data sets. The results of the proposed approach outperform the three base classifiers. Improved values for the metrics of 88.88% f-score, 92.73% accuracy, 90.64% precision, 96.64% specificity, 87.2% sensitivity, and the area under the ROC curve is 0.91 are achieved. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=MiRNAs" title="MiRNAs">MiRNAs</a>, <a href="https://publications.waset.org/abstracts/search?q=SVM%20classification" title=" SVM classification"> SVM classification</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20algorithm" title=" ensemble algorithm"> ensemble algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=assumption%20problem" title=" assumption problem"> assumption problem</a>, <a href="https://publications.waset.org/abstracts/search?q=imbalance%20data" title=" imbalance data"> imbalance data</a> </p> <a href="https://publications.waset.org/abstracts/32331/ensemble-based-svm-classification-approach-for-mirna-prediction" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/32331.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">349</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">173</span> Recommender Systems Using Ensemble Techniques</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Yeonjeong%20Lee">Yeonjeong Lee</a>, <a href="https://publications.waset.org/abstracts/search?q=Kyoung-jae%20Kim"> Kyoung-jae Kim</a>, <a href="https://publications.waset.org/abstracts/search?q=Youngtae%20Kim"> Youngtae Kim</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This study proposes a novel recommender system that uses data mining and multi-model ensemble techniques to enhance the recommendation performance through reflecting the precise user’s preference. The proposed model consists of two steps. In the first step, this study uses logistic regression, decision trees, and artificial neural networks to predict customers who have high likelihood to purchase products in each product group. Then, this study combines the results of each predictor using the multi-model ensemble techniques such as bagging and bumping. In the second step, this study uses the market basket analysis to extract association rules for co-purchased products. Finally, the system selects customers who have high likelihood to purchase products in each product group and recommends proper products from same or different product groups to them through above two steps. We test the usability of the proposed system by using prototype and real-world transaction and profile data. In addition, we survey about user satisfaction for the recommended product list from the proposed system and the randomly selected product lists. The results also show that the proposed system may be useful in real-world online shopping store. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=product%20recommender%20system" title="product recommender system">product recommender system</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20technique" title=" ensemble technique"> ensemble technique</a>, <a href="https://publications.waset.org/abstracts/search?q=association%20rules" title=" association rules"> association rules</a>, <a href="https://publications.waset.org/abstracts/search?q=decision%20tree" title=" decision tree"> decision tree</a>, <a href="https://publications.waset.org/abstracts/search?q=artificial%20neural%20networks" title=" artificial neural networks"> artificial neural networks</a> </p> <a href="https://publications.waset.org/abstracts/1875/recommender-systems-using-ensemble-techniques" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/1875.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">294</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">172</span> Differences in the Level of Self-Efficacy and Intensity of Narcissism among Band and Solo Musicians</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Weronika%20Moli%C5%84ska">Weronika Molińska</a>, <a href="https://publications.waset.org/abstracts/search?q=Joanna%20Rajchert"> Joanna Rajchert</a> </p> <p class="card-text"><strong>Abstract:</strong></p> A musical career is not only about the quality of performing or playing music. Musicians can choose from a variety of specializations and career paths. The described study focused on psychological traits which relate to a solo career (performing individually or as a leader) or performing as part of a chamber ensemble, ensemble, choir, or orchestra. The hypothesis predicted that narcissism and self-efficacy would be higher in musicians performing solo. The study involved 124 professional musicians: instrumentalists and soloists, singers (n = 59), and ensemble instrumentalists and singers (n = 65). The results confirmed the hypothesis and showed that soloists were higher on self-efficacy and narcissism. In particular, soloists were higher on leader characteristics, demand for admiration, and vanity than musicians performing in ensembles. The result of these studies is a good introduction to a broader project answering the questions of what can increase or decrease the musician's sense of self-efficacy and whether the decreased self-efficacy could induce musicians to give up their solo careers. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=self-efficacy" title="self-efficacy">self-efficacy</a>, <a href="https://publications.waset.org/abstracts/search?q=musicians" title=" musicians"> musicians</a>, <a href="https://publications.waset.org/abstracts/search?q=musical%20profession" title=" musical profession"> musical profession</a>, <a href="https://publications.waset.org/abstracts/search?q=narcissism" title=" narcissism"> narcissism</a>, <a href="https://publications.waset.org/abstracts/search?q=soloists" title=" soloists"> soloists</a> </p> <a href="https://publications.waset.org/abstracts/146167/differences-in-the-level-of-self-efficacy-and-intensity-of-narcissism-among-band-and-solo-musicians" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/146167.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">65</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">171</span> Improve Student Performance Prediction Using Majority Vote Ensemble Model for Higher Education</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Wade%20Ghribi">Wade Ghribi</a>, <a href="https://publications.waset.org/abstracts/search?q=Abdelmoty%20M.%20Ahmed"> Abdelmoty M. Ahmed</a>, <a href="https://publications.waset.org/abstracts/search?q=Ahmed%20Said%20Badawy"> Ahmed Said Badawy</a>, <a href="https://publications.waset.org/abstracts/search?q=Belgacem%20Bouallegue"> Belgacem Bouallegue</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In higher education institutions, the most pressing priority is to improve student performance and retention. Large volumes of student data are used in Educational Data Mining techniques to find new hidden information from students' learning behavior, particularly to uncover the early symptom of at-risk pupils. On the other hand, data with noise, outliers, and irrelevant information may provide incorrect conclusions. By identifying features of students' data that have the potential to improve performance prediction results, comparing and identifying the most appropriate ensemble learning technique after preprocessing the data, and optimizing the hyperparameters, this paper aims to develop a reliable students' performance prediction model for Higher Education Institutions. Data was gathered from two different systems: a student information system and an e-learning system for undergraduate students in the College of Computer Science of a Saudi Arabian State University. The cases of 4413 students were used in this article. The process includes data collection, data integration, data preprocessing (such as cleaning, normalization, and transformation), feature selection, pattern extraction, and, finally, model optimization and assessment. Random Forest, Bagging, Stacking, Majority Vote, and two types of Boosting techniques, AdaBoost and XGBoost, are ensemble learning approaches, whereas Decision Tree, Support Vector Machine, and Artificial Neural Network are supervised learning techniques. Hyperparameters for ensemble learning systems will be fine-tuned to provide enhanced performance and optimal output. 
The findings imply that combining features of students' behavior from e-learning and students' information systems using Majority Vote produced better outcomes than the other ensemble techniques. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=educational%20data%20mining" title="educational data mining">educational data mining</a>, <a href="https://publications.waset.org/abstracts/search?q=student%20performance%20prediction" title=" student performance prediction"> student performance prediction</a>, <a href="https://publications.waset.org/abstracts/search?q=e-learning" title=" e-learning"> e-learning</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20learning" title=" ensemble learning"> ensemble learning</a>, <a href="https://publications.waset.org/abstracts/search?q=higher%20education" title=" higher education"> higher education</a> </p> <a href="https://publications.waset.org/abstracts/149220/improve-student-performance-prediction-using-majority-vote-ensemble-model-for-higher-education" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/149220.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">108</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">170</span> An Ensemble Learning Method for Applying Particle Swarm Optimization Algorithms to Systems Engineering Problems</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Ken%20Hampshire">Ken Hampshire</a>, <a href="https://publications.waset.org/abstracts/search?q=Thomas%20Mazzuchi"> Thomas Mazzuchi</a>, <a href="https://publications.waset.org/abstracts/search?q=Shahram%20Sarkani"> Shahram Sarkani</a> </p> <p class="card-text"><strong>Abstract:</strong></p> As a subset of metaheuristics, nature-inspired optimization algorithms such as particle swarm optimization (PSO) have shown promise both in solving intractable problems and in their extensibility to novel problem formulations due to their general approach requiring few assumptions. Unfortunately, single instantiations of algorithms require detailed tuning of parameters and cannot be proven to be best suited to a particular illustrative problem on account of the “no free lunch” (NFL) theorem. Using these algorithms in real-world problems requires exquisite knowledge of the many techniques and is not conducive to reconciling the various approaches to given classes of problems. This research aims to present a unified view of PSO-based approaches from the perspective of relevant systems engineering problems, with the express purpose of then eliciting the best solution for any problem formulation in an ensemble learning bucket of models approach. The central hypothesis of the research is that extending the PSO algorithms found in the literature to real-world optimization problems requires a general ensemble-based method for all problem formulations but a specific implementation and solution for any instance. 
The main results are a problem-based literature survey and a general method to find more globally optimal solutions for any systems engineering optimization problem. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=particle%20swarm%20optimization" title="particle swarm optimization">particle swarm optimization</a>, <a href="https://publications.waset.org/abstracts/search?q=nature-inspired%20optimization" title=" nature-inspired optimization"> nature-inspired optimization</a>, <a href="https://publications.waset.org/abstracts/search?q=metaheuristics" title=" metaheuristics"> metaheuristics</a>, <a href="https://publications.waset.org/abstracts/search?q=systems%20engineering" title=" systems engineering"> systems engineering</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20learning" title=" ensemble learning"> ensemble learning</a> </p> <a href="https://publications.waset.org/abstracts/167097/an-ensemble-learning-method-for-applying-particle-swarm-optimization-algorithms-to-systems-engineering-problems" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/167097.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">99</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">169</span> Methods for Enhancing Ensemble Learning or Improving Classifiers of This Technique in the Analysis and Classification of Brain Signals</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Seyed%20Mehdi%20Ghezi">Seyed Mehdi Ghezi</a>, <a href="https://publications.waset.org/abstracts/search?q=Hesam%20Hasanpoor"> Hesam Hasanpoor</a> </p> <p class="card-text"><strong>Abstract:</strong></p> This scientific article explores enhancement methods for ensemble learning with the aim of improving the performance of classifiers in the analysis and classification of brain signals. The research approach in this field consists of two main parts, each with its own strengths and weaknesses. The choice of approach depends on the specific research question and available resources. By combining these approaches and leveraging their respective strengths, researchers can enhance the accuracy and reliability of classification results, consequently advancing our understanding of the brain and its functions. The first approach focuses on utilizing machine learning methods to identify the best features among the vast array of features present in brain signals. The selection of features varies depending on the research objective, and different techniques have been employed for this purpose. For instance, the genetic algorithm has been used in some studies to identify the best features, while optimization methods have been utilized in others to identify the most influential features. Additionally, machine learning techniques have been applied to determine the influential electrodes in classification. Ensemble learning plays a crucial role in identifying the best features that contribute to learning, thereby improving the overall results. The second approach concentrates on designing and implementing methods for selecting the best classifier or utilizing meta-classifiers to enhance the final results in ensemble learning. 
In a different section of the research, a single classifier is used instead of multiple classifiers, employing different sets of features to improve the results. The article provides an in-depth examination of each technique, highlighting its advantages and limitations. By integrating these techniques, researchers can enhance the performance of classifiers in the analysis and classification of brain signals. This advancement in ensemble learning methodologies contributes to a better understanding of the brain and its functions, ultimately leading to improved accuracy and reliability in brain signal analysis and classification. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=ensemble%20learning" title="ensemble learning">ensemble learning</a>, <a href="https://publications.waset.org/abstracts/search?q=brain%20signals" title=" brain signals"> brain signals</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a>, <a href="https://publications.waset.org/abstracts/search?q=feature%20selection" title=" feature selection"> feature selection</a>, <a href="https://publications.waset.org/abstracts/search?q=machine%20learning" title=" machine learning"> machine learning</a>, <a href="https://publications.waset.org/abstracts/search?q=genetic%20algorithm" title=" genetic algorithm"> genetic algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=optimization%20methods" title=" optimization methods"> optimization methods</a>, <a href="https://publications.waset.org/abstracts/search?q=influential%20features" title=" influential features"> influential features</a>, <a href="https://publications.waset.org/abstracts/search?q=influential%20electrodes" title=" influential electrodes"> influential electrodes</a>, <a href="https://publications.waset.org/abstracts/search?q=meta-classifiers" title=" meta-classifiers"> meta-classifiers</a> </p> <a href="https://publications.waset.org/abstracts/177312/methods-for-enhancing-ensemble-learning-or-improving-classifiers-of-this-technique-in-the-analysis-and-classification-of-brain-signals" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/177312.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">75</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">168</span> Multilabel Classification with Neural Network Ensemble Method</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sezin%20Ek%C5%9Fio%C4%9Flu">Sezin Ekşioğlu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Multilabel classification is of great importance for several applications and is also a challenging research topic. It is a kind of supervised learning with binary targets. The difference between multilabel and binary classification is that multilabel problems allow more than one class per instance, so an instance can belong to one class or to many classes. There is a wide range of applications for multilabel prediction, such as image labeling, text categorization, and gene functionality prediction. Even though instances are assigned to many classes, they may not always be properly classified.
There are many ensemble methods for classification; however, most researchers have been concerned with better multilabel methods in general, and few focus on both classifier efficiency and pairwise label relationships at the same time in order to achieve better multilabel classification. In this paper, we work on modified ensemble methods that benefit from k-Nearest Neighbors and a neural network structure to address these issues and to improve multilabel classification. Experiments on publicly available datasets (yeast, emotion, scene, and birds) demonstrate the efficiency of the developed algorithm, which is measured by accuracy, F1 score, and Hamming loss. Our algorithm improves on the benchmarks for each dataset across these metrics. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=multilabel" title="multilabel">multilabel</a>, <a href="https://publications.waset.org/abstracts/search?q=classification" title=" classification"> classification</a>, <a href="https://publications.waset.org/abstracts/search?q=neural%20network" title=" neural network"> neural network</a>, <a href="https://publications.waset.org/abstracts/search?q=KNN" title=" KNN"> KNN</a> </p> <a href="https://publications.waset.org/abstracts/148169/multilabel-classification-with-neural-network-ensemble-method" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/148169.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">155</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">167</span> A Genetic Algorithm Based Ensemble Method with Pairwise Consensus Score on Malware Cacophonous Labels</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Shih-Yu%20Wang">Shih-Yu Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Shun-Wen%20Hsiao"> Shun-Wen Hsiao</a> </p> <p class="card-text"><strong>Abstract:</strong></p> In the field of cybersecurity, many vendors provide classification results for malware samples in the form of labels that contain important information, also called AV labels. Many researchers rely on AV labels in their research. Unfortunately, AV labels are cluttered: they have neither a fixed format nor fixed naming rules, because the naming results are based on each classifier's viewpoint. One way to fix the problem is to take a majority vote; however, voting can sometimes introduce bias. Thus, we create a novel ensemble approach that does not rely on the cacophonous naming results but depends on group identification to aggregate every vendor's opinion. To achieve this, we develop a scoring system called Pairwise Consensus Score (PCS) to calculate result similarity. The overall method combines a Genetic Algorithm and PCS to find maximum consensus in the group. Experimental results reveal that our method outperformed majority voting by 10% in terms of the score.
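Since the abstract does not spell out the encoding or the exact Pairwise Consensus Score, the sketch below only illustrates the general shape of the approach: a genetic algorithm evolves a grouping of samples so that average pairwise agreement within groups is maximised. The token sets, the Jaccard agreement measure, and the GA settings are all illustrative assumptions, not the authors' method. <pre><code class="language-python">
# Toy genetic-algorithm sketch: maximise average within-group pairwise agreement.
import random

random.seed(0)
N_SAMPLES, N_GROUPS, POP, GENERATIONS = 30, 4, 40, 60

# Toy "AV label" data: each sample is a set of tokens; agreement = Jaccard similarity.
labels = [{random.randint(0, 9) for _ in range(3)} for _ in range(N_SAMPLES)]

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 1.0

def fitness(assignment):
    # Mean pairwise similarity of samples placed in the same group (stand-in for PCS).
    score, pairs = 0.0, 0
    for i in range(N_SAMPLES):
        for j in range(i + 1, N_SAMPLES):
            if assignment[i] == assignment[j]:
                score += jaccard(labels[i], labels[j])
                pairs += 1
    return score / pairs if pairs else 0.0

def mutate(assignment):
    child = assignment[:]
    child[random.randrange(N_SAMPLES)] = random.randrange(N_GROUPS)
    return child

def crossover(a, b):
    cut = random.randrange(1, N_SAMPLES)
    return a[:cut] + b[cut:]

population = [[random.randrange(N_GROUPS) for _ in range(N_SAMPLES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)   # rank candidate groupings by consensus
    elite = population[: POP // 4]               # keep the best quarter
    population = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                          for _ in range(POP - len(elite))]

print("best consensus score:", round(fitness(max(population, key=fitness)), 3))
</code></pre>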
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=genetic%20algorithm" title="genetic algorithm">genetic algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20learning" title=" ensemble learning"> ensemble learning</a>, <a href="https://publications.waset.org/abstracts/search?q=malware%20family" title=" malware family"> malware family</a>, <a href="https://publications.waset.org/abstracts/search?q=malware%20labeling" title=" malware labeling"> malware labeling</a>, <a href="https://publications.waset.org/abstracts/search?q=AV%20labels" title=" AV labels"> AV labels</a> </p> <a href="https://publications.waset.org/abstracts/159376/a-genetic-algorithm-based-ensemble-method-with-pairwise-consensus-score-on-malware-cacophonous-labels" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/159376.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">86</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">166</span> Neuroevolution Based on Adaptive Ensembles of Biologically Inspired Optimization Algorithms Applied for Modeling a Chemical Engineering Process</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Sabina-Adriana%20Floria">Sabina-Adriana Floria</a>, <a href="https://publications.waset.org/abstracts/search?q=Marius%20Gavrilescu"> Marius Gavrilescu</a>, <a href="https://publications.waset.org/abstracts/search?q=Florin%20Leon"> Florin Leon</a>, <a href="https://publications.waset.org/abstracts/search?q=Silvia%20Curteanu"> Silvia Curteanu</a>, <a href="https://publications.waset.org/abstracts/search?q=Costel%20Anton"> Costel Anton</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Neuroevolution is a subfield of artificial intelligence used to solve various problems in different application areas. Specifically, neuroevolution is a technique that applies biologically inspired methods to generate neural network architectures and optimize their parameters automatically. In this paper, we use different biologically inspired optimization algorithms in an ensemble strategy with the aim of training multilayer perceptron neural networks, resulting in regression models used to simulate the industrial chemical process of obtaining bricks from silicone-based materials. Installations in the raw ceramics industry, i.e., bricks, are characterized by significant energy consumption and large quantities of emissions. In addition, the initial conditions that were taken into account during the design and commissioning of the installation can change over time, which leads to the need to add new mixes to adjust the operating conditions for the desired purpose, e.g., material properties and energy saving. The present approach follows the study by simulation of a process of obtaining bricks from silicone-based materials, i.e., the modeling and optimization of the process. Optimization aims to determine the working conditions that minimize the emissions represented by nitrogen monoxide. We first use a search procedure to find the best values for the parameters of various biologically inspired optimization algorithms. 
Then, we propose an adaptive ensemble strategy that uses only a subset of the best algorithms identified in the search stage. The adaptive ensemble strategy combines the results of selected algorithms and automatically assigns more processing capacity to the more efficient algorithms. Their efficiency may also vary at different stages of the optimization process. In a given ensemble iteration, the most efficient algorithms aim to maintain good convergence, while the less efficient algorithms can improve population diversity. The proposed adaptive ensemble strategy outperforms the individual optimizers and the non-adaptive ensemble strategy in convergence speed, and the obtained results provide lower error values. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=optimization" title="optimization">optimization</a>, <a href="https://publications.waset.org/abstracts/search?q=biologically%20inspired%20algorithm" title=" biologically inspired algorithm"> biologically inspired algorithm</a>, <a href="https://publications.waset.org/abstracts/search?q=neuroevolution" title=" neuroevolution"> neuroevolution</a>, <a href="https://publications.waset.org/abstracts/search?q=ensembles" title=" ensembles"> ensembles</a>, <a href="https://publications.waset.org/abstracts/search?q=bricks" title=" bricks"> bricks</a>, <a href="https://publications.waset.org/abstracts/search?q=emission%20minimization" title=" emission minimization"> emission minimization</a> </p> <a href="https://publications.waset.org/abstracts/162135/neuroevolution-based-on-adaptive-ensembles-of-biologically-inspired-optimization-algorithms-applied-for-modeling-a-chemical-engineering-process" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/162135.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">116</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">165</span> A Dynamic Ensemble Learning Approach for Online Anomaly Detection in Alibaba Datacenters</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Wanyi%20Zhu">Wanyi Zhu</a>, <a href="https://publications.waset.org/abstracts/search?q=Xia%20Ming"> Xia Ming</a>, <a href="https://publications.waset.org/abstracts/search?q=Huafeng%20Wang"> Huafeng Wang</a>, <a href="https://publications.waset.org/abstracts/search?q=Junda%20Chen"> Junda Chen</a>, <a href="https://publications.waset.org/abstracts/search?q=Lu%20Liu"> Lu Liu</a>, <a href="https://publications.waset.org/abstracts/search?q=Jiangwei%20Jiang"> Jiangwei Jiang</a>, <a href="https://publications.waset.org/abstracts/search?q=Guohua%20Liu"> Guohua Liu</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Anomaly detection is a first and imperative step needed to respond to unexpected problems and to assure high performance and security in large data center management. This paper presents an online anomaly detection system through an innovative approach of ensemble machine learning and adaptive differentiation algorithms, and applies them to performance data collected from a continuous monitoring system for multi-tier web applications running in Alibaba data centers. 
We evaluate the effectiveness and efficiency of this algorithm with production traffic data and compare it with traditional anomaly detection approaches such as static thresholds and other deviation-based detection techniques. The experimental results show that our algorithm correctly identifies unexpected performance variances of any running application, with an acceptable false positive rate. The proposed approach has already been deployed in real-time production environments to enhance the efficiency and stability of daily data center operations. <p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Alibaba%20data%20centers" title="Alibaba data centers">Alibaba data centers</a>, <a href="https://publications.waset.org/abstracts/search?q=anomaly%20detection" title=" anomaly detection"> anomaly detection</a>, <a href="https://publications.waset.org/abstracts/search?q=big%20data%20computation" title=" big data computation"> big data computation</a>, <a href="https://publications.waset.org/abstracts/search?q=dynamic%20ensemble%20learning" title=" dynamic ensemble learning"> dynamic ensemble learning</a> </p> <a href="https://publications.waset.org/abstracts/86171/a-dynamic-ensemble-learning-approach-for-online-anomaly-detection-in-alibaba-datacenters" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/86171.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">201</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">164</span> An Ensemble Deep Learning Architecture for Imbalanced Classification of Thoracic Surgery Patients</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Saba%20%20Ebrahimi">Saba Ebrahimi</a>, <a href="https://publications.waset.org/abstracts/search?q=Saeed%20Ahmadian"> Saeed Ahmadian</a>, <a href="https://publications.waset.org/abstracts/search?q=Hedie%20%20Ashrafi"> Hedie Ashrafi</a> </p> <p class="card-text"><strong>Abstract:</strong></p> Selecting appropriate patients for surgery is one of the main issues in thoracic surgery (TS). Both the short-term and long-term risks and benefits of surgery must be considered in the patient selection criteria. Existing datasets of TS patients have some limitations because of missing attribute values and an imbalanced distribution of survival classes. In this study, a novel ensemble architecture of deep learning networks is proposed, based on stacking different linear and non-linear layers, to deal with imbalanced datasets. The categorical and numerical features are split across different layers with the ability to shrink unnecessary features. Then, after extracting insight from the raw features, a novel biased-kernel layer is applied to reinforce the gradient of the minority class so that the network is trained better than with current methods. Finally, the performance and advantages of our proposed model over the existing models are examined for predicting patient survival after thoracic surgery using real-life clinical data for lung cancer patients.
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=deep%20learning" title="deep learning">deep learning</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20models" title=" ensemble models"> ensemble models</a>, <a href="https://publications.waset.org/abstracts/search?q=imbalanced%20classification" title=" imbalanced classification"> imbalanced classification</a>, <a href="https://publications.waset.org/abstracts/search?q=lung%20cancer" title=" lung cancer"> lung cancer</a>, <a href="https://publications.waset.org/abstracts/search?q=TS%20patient%20selection" title=" TS patient selection"> TS patient selection</a> </p> <a href="https://publications.waset.org/abstracts/128394/an-ensemble-deep-learning-architecture-for-imbalanced-classification-of-thoracic-surgery-patients" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/128394.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">145</span> </span> </div> </div> <div class="card paper-listing mb-3 mt-3"> <h5 class="card-header" style="font-size:.9rem"><span class="badge badge-info">163</span> Ensemble of Deep CNN Architecture for Classifying the Source and Quality of Teff Cereal</h5> <div class="card-body"> <p class="card-text"><strong>Authors:</strong> <a href="https://publications.waset.org/abstracts/search?q=Belayneh%20Matebie">Belayneh Matebie</a>, <a href="https://publications.waset.org/abstracts/search?q=Michael%20Melese"> Michael Melese</a> </p> <p class="card-text"><strong>Abstract:</strong></p> The study focuses on addressing the challenges in classifying and ensuring the quality of Eragrostis Teff, a small and round grain that is the smallest cereal grain. Employing a traditional classification method is challenging because of its small size and the similarity of its environmental characteristics. To overcome this, this study employs a machine learning approach to develop a source and quality classification system for Teff cereal. Data is collected from various production areas in the Amhara regions, considering two types of cereal (high and low quality) across eight classes. A total of 5,920 images are collected, with 740 images for each class. Image enhancement techniques, including scaling, data augmentation, histogram equalization, and noise removal, are applied to preprocess the data. Convolutional Neural Network (CNN) is then used to extract relevant features and reduce dimensionality. The dataset is split into 80% for training and 20% for testing. Different classifiers, including FVGG16, FINCV3, QSCTC, EMQSCTC, SVM, and RF, are employed for classification, achieving accuracy rates ranging from 86.91% to 97.72%. The ensemble of FVGG16, FINCV3, and QSCTC using the Max-Voting approach outperforms individual algorithms. 
<p class="card-text"><strong>Keywords:</strong> <a href="https://publications.waset.org/abstracts/search?q=Teff" title="Teff">Teff</a>, <a href="https://publications.waset.org/abstracts/search?q=ensemble%20learning" title=" ensemble learning"> ensemble learning</a>, <a href="https://publications.waset.org/abstracts/search?q=max-voting" title=" max-voting"> max-voting</a>, <a href="https://publications.waset.org/abstracts/search?q=CNN" title=" CNN"> CNN</a>, <a href="https://publications.waset.org/abstracts/search?q=SVM" title=" SVM"> SVM</a>, <a href="https://publications.waset.org/abstracts/search?q=RF" title=" RF"> RF</a> </p> <a href="https://publications.waset.org/abstracts/186043/ensemble-of-deep-cnn-architecture-for-classifying-the-source-and-quality-of-teff-cereal" class="btn btn-primary btn-sm">Procedia</a> <a href="https://publications.waset.org/abstracts/186043.pdf" target="_blank" class="btn btn-primary btn-sm">PDF</a> <span class="bg-info text-light px-1 py-1 float-right rounded"> Downloads <span class="badge badge-light">53</span> </span> </div> </div> <ul class="pagination"> <li class="page-item disabled"><span class="page-link">‹</span></li> <li class="page-item active"><span class="page-link">1</span></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=multi-model%20ensemble&page=2">2</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=multi-model%20ensemble&page=3">3</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=multi-model%20ensemble&page=4">4</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=multi-model%20ensemble&page=5">5</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=multi-model%20ensemble&page=6">6</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=multi-model%20ensemble&page=7">7</a></li> <li class="page-item"><a class="page-link" href="https://publications.waset.org/abstracts/search?q=multi-model%20ensemble&page=2" rel="next">›</a></li> </ul> </div> </main> <footer> <div id="infolinks" class="pt-3 pb-2"> <div class="container"> <div style="background-color:#f5f5f5;" class="p-3"> <div class="row"> <div class="col-md-2"> <ul class="list-unstyled"> About <li><a href="https://waset.org/page/support">About Us</a></li> <li><a href="https://waset.org/page/support#legal-information">Legal</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/WASET-16th-foundational-anniversary.pdf">WASET celebrates its 16th foundational anniversary</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Account <li><a href="https://waset.org/profile">My Account</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Explore <li><a href="https://waset.org/disciplines">Disciplines</a></li> <li><a href="https://waset.org/conferences">Conferences</a></li> <li><a href="https://waset.org/conference-programs">Conference Program</a></li> <li><a href="https://waset.org/committees">Committees</a></li> <li><a href="https://publications.waset.org">Publications</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Research <li><a href="https://publications.waset.org/abstracts">Abstracts</a></li> <li><a href="https://publications.waset.org">Periodicals</a></li> <li><a 
href="https://publications.waset.org/archive">Archive</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Open Science <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Philosophy.pdf">Open Science Philosophy</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Science-Award.pdf">Open Science Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Open-Society-Open-Science-and-Open-Innovation.pdf">Open Innovation</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Postdoctoral-Fellowship-Award.pdf">Postdoctoral Fellowship Award</a></li> <li><a target="_blank" rel="nofollow" href="https://publications.waset.org/static/files/Scholarly-Research-Review.pdf">Scholarly Research Review</a></li> </ul> </div> <div class="col-md-2"> <ul class="list-unstyled"> Support <li><a href="https://waset.org/page/support">Support</a></li> <li><a href="https://waset.org/profile/messages/create">Contact Us</a></li> <li><a href="https://waset.org/profile/messages/create">Report Abuse</a></li> </ul> </div> </div> </div> </div> </div> <div class="container text-center"> <hr style="margin-top:0;margin-bottom:.3rem;"> <a href="https://creativecommons.org/licenses/by/4.0/" target="_blank" class="text-muted small">Creative Commons Attribution 4.0 International License</a> <div id="copy" class="mt-2">© 2024 World Academy of Science, Engineering and Technology</div> </div> </footer> <a href="javascript:" id="return-to-top"><i class="fas fa-arrow-up"></i></a> <div class="modal" id="modal-template"> <div class="modal-dialog"> <div class="modal-content"> <div class="row m-0 mt-1"> <div class="col-md-12"> <button type="button" class="close" data-dismiss="modal" aria-label="Close"><span aria-hidden="true">×</span></button> </div> </div> <div class="modal-body"></div> </div> </div> </div> <script src="https://cdn.waset.org/static/plugins/jquery-3.3.1.min.js"></script> <script src="https://cdn.waset.org/static/plugins/bootstrap-4.2.1/js/bootstrap.bundle.min.js"></script> <script src="https://cdn.waset.org/static/js/site.js?v=150220211556"></script> <script> jQuery(document).ready(function() { /*jQuery.get("https://publications.waset.org/xhr/user-menu", function (response) { jQuery('#mainNavMenu').append(response); });*/ jQuery.get({ url: "https://publications.waset.org/xhr/user-menu", cache: false }).then(function(response){ jQuery('#mainNavMenu').append(response); }); }); </script> </body> </html>