<!-- views/paperById.ejs --> <!DOCTYPE html> <html> <head> <meta charset="utf-8"> <title>SCITEPRESS - SCIENCE AND TECHNOLOGY PUBLICATIONS</title> <meta name="description" content="Digital Library" /> <meta name="citation_language" content="en"> <meta name="citation_title" content="Adaptive Combination of a Genetic Algorithm and Novelty Search for Deep Neuroevolution"> <meta name="citation_abstract" content="Evolutionary Computation (EC) has been shown to be able to quickly train Deep Artificial Neural Networks (DNNs) to solve Reinforcement Learning (RL) problems. While a Genetic Algorithm (GA) is well-suited for exploiting reward functions that are neither deceptive nor sparse, it struggles when the reward function is either of those. To that end, Novelty Search (NS) has been shown to be able to outperform gradient-following optimizers in some cases, while under-performing in others. We propose a new algorithm: Explore-Exploit g-Adaptive Learner (E²gAL, or EyAL). By preserving a dynamically-sized niche of novelty-seeking agents, the algorithm manages to maintain population diversity, exploiting the reward signal when possible and exploring otherwise. The algorithm combines both the exploitation power of a GA and the exploration power of NS, while maintaining their simplicity and elegance. Our experiments show that EyAL outperforms NS in most scenarios, while being on par with a GA, and in some scenarios it can outperform both. 
EyAL also allows the substitution of the exploiting component (GA) and the exploring component (NS) with other algorithms, e.g., Evolution Strategy and Surprise Search, thus opening the door for future research."> <meta name="citation_publication_date" content="2022/10/24"> <meta name="citation_conference_title" content="International Joint Conference on Computational Intelligence (IJCCI)"> <meta name="citation_keywords" content="Reinforcement Learning; Evolutionary Computation; Novelty Search; Genetic Algorithm;"> <meta name="citation_doi" content="10.5220/0011550200003332"> <meta name="citation_isbn" content="978-989-758-611-8"> <meta name="citation_volume" content="2"> <meta name="citation_firstpage" content="143"> <meta name="citation_lastpage" content="150"> <meta name="citation_publisher" content="SCITEPRESS"> <meta name="citation_author" content="Eyal Segal" > <meta name="citation_author_institution" content="Department of Computer Science, Ben-Gurion University, Beer Sheva 84105, Israel" > <meta name="citation_author" content="Moshe Sipper" > <meta name="citation_author_institution" content="Department of Computer Science, Ben-Gurion University, Beer Sheva 84105, Israel" > <meta name="citation_abstract_html_url" content="http://www.scitepress.org/Papers/2022/115502"> <meta name="citation_pdf_url" content="http://www.scitepress.org/Papers/2022/115502/115502.pdf"> </head> <body> <article> <a href="/publishedPapers/2022/115502/pdf/index.html"><h1 class="citation_title">Adaptive Combination of a Genetic Algorithm and Novelty Search for Deep Neuroevolution</h1></a> <h3 class="citation_author"> Eyal Segal, Moshe Sipper</h3> <h4 class="citation_publication_date">2022</h4> <h4>Abstract</h4> <p class="citation_abstract">Evolutionary Computation (EC) has been shown to be able to quickly train Deep Artificial Neural Networks (DNNs) to solve Reinforcement Learning (RL) problems. 
While a Genetic Algorithm (GA) is well-suited for exploiting reward functions that are neither deceptive nor sparse, it struggles when the reward function is either of those. To that end, Novelty Search (NS) has been shown to be able to outperform gradient-following optimizers in some cases, while under-performing in others. We propose a new algorithm: Explore-Exploit g-Adaptive Learner (E<sup>2</sup>gAL, or EyAL). By preserving a dynamically-sized niche of novelty-seeking agents, the algorithm manages to maintain population diversity, exploiting the reward signal when possible and exploring otherwise. The algorithm combines both the exploitation power of a GA and the exploration power of NS, while maintaining their simplicity and elegance. Our experiments show that EyAL outperforms NS in most scenarios, while being on par with a GA, and in some scenarios it can outperform both. EyAL also allows the substitution of the exploiting component (GA) and the exploring component (NS) with other algorithms, e.g., Evolution Strategy and Surprise Search, thus opening the door for future research.</p> <a href="http://www.scitepress.org/Papers/2022/115502/115502.pdf" class="citation_pdf_url">Download</a> <br /> <br /> <br/> <h4 style="margin:0;">Paper Citation</h4> <br/> <h4 style="margin:0;">in Harvard Style</h4> <p style="margin:0;">Segal E. and Sipper M. (2022). <b>Adaptive Combination of a Genetic Algorithm and Novelty Search for Deep Neuroevolution</b>. In <i>Proceedings of the 14th International Joint Conference on Computational Intelligence (IJCCI 2022) - Volume 1: ECTA</i>; ISBN 978-989-758-611-8, SciTePress, pages 143-150. 
DOI: 10.5220/0011550200003332</p> <br/> <h4 style="margin:0;">in Bibtex Style</h4> <p style="margin:0;">@conference{ecta22,<br />author={Eyal Segal and Moshe Sipper},<br />title={Adaptive Combination of a Genetic Algorithm and Novelty Search for Deep Neuroevolution},<br />booktitle={Proceedings of the 14th International Joint Conference on Computational Intelligence (IJCCI 2022) - Volume 1: ECTA},<br />year={2022},<br />pages={143-150},<br />publisher={SciTePress},<br />organization={INSTICC},<br />doi={10.5220/0011550200003332},<br />isbn={978-989-758-611-8},<br />}</p> <br/> <h4 style="margin:0;">in EndNote Style</h4> <p style="margin:0;">TY - CONF<br />JO - Proceedings of the 14th International Joint Conference on Computational Intelligence (IJCCI 2022) - Volume 1: ECTA<br />TI - Adaptive Combination of a Genetic Algorithm and Novelty Search for Deep Neuroevolution<br />SN - 978-989-758-611-8<br />AU - Segal E.<br />AU - Sipper M.<br />PY - 2022<br />SP - 143<br />EP - 150<br />DO - 10.5220/0011550200003332<br />PB - SciTePress<br /></p> <br/> </article> </body> </html>