Frontiers | A Deep Learning Approach for Automatic Seizure Detection in Children With Epilepsy

Journal: Frontiers in Computational Neuroscience (Front. Comput. Neurosci.), Volume 15
DOI: 10.3389/fncom.2021.650050 | ISSN: 1662-5188
Published: April 8, 2021 (online March 15, 2021)
Authors: Ahmed Abdelhameed and Magdy Bayoumi, Department of Electrical and Computer Engineering, University of Louisiana at Lafayette, United States
Keywords: deep learning; epileptic seizure detection; EEG; autoencoders; classification; convolutional neural networks; bidirectional long short-term memory (Bi-LSTM)

Abstract

Over the last few decades, the electroencephalogram (EEG) has become one of the most vital tools used by physicians to diagnose neurological disorders of the human brain and, in particular, to detect seizures. Because of the peculiar nature of epileptic seizures and their consequent impact on patients' quality of life, precise diagnosis of epilepsy is essential. This article therefore proposes a novel deep-learning approach for detecting seizures in pediatric patients, based on the classification of raw, minimally pre-processed multichannel EEG recordings. The new approach takes advantage of the automatic feature-learning capability of a two-dimensional deep convolutional autoencoder (2D-DCAE) linked to a neural-network-based classifier, forming a unified system that is trained in a supervised way to achieve the best classification accuracy between ictal and interictal brain-state signals. To test and evaluate the approach, two models were designed and assessed using three different EEG data segment lengths and a 10-fold cross-validation scheme. Across five evaluation metrics, the best-performing model was a supervised deep convolutional autoencoder (SDCAE) that uses a bidirectional long short-term memory (Bi-LSTM) based classifier and an EEG segment length of 4 s. On the public dataset collected by the Children's Hospital Boston (CHB) and the Massachusetts Institute of Technology (MIT), this model obtained 98.79 ± 0.53% accuracy, 98.72 ± 0.77% sensitivity, 98.86 ± 0.53% specificity, 98.86 ± 0.53% precision, and an F1-score of 98.79 ± 0.53%. Based on these results, the new approach stands as one of the most effective seizure detection methods compared with existing state-of-the-art methods applied to the same dataset.
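The abstract describes the SDCAE as a 2D convolutional encoder-decoder that learns features directly from raw multichannel EEG segments, with a Bi-LSTM classifier attached to the encoder's latent representation and the whole network trained jointly (reconstruction plus classification). Below is a minimal PyTorch sketch of that idea. The layer sizes, channel count (23 CHB-MIT channels), segment length (4 s at 256 Hz = 1024 samples), and unweighted loss sum are illustrative assumptions, not the authors' exact configuration.

    # Hypothetical sketch of an SDCAE-style model: a 2D convolutional
    # autoencoder whose latent features also feed a Bi-LSTM classifier.
    # Shapes assume 23-channel CHB-MIT EEG, 4 s segments at 256 Hz
    # (1024 samples); all hyperparameters are illustrative guesses.
    import torch
    import torch.nn as nn

    class SDCAE(nn.Module):
        def __init__(self, n_channels=23):
            super().__init__()
            # Encoder: treat each segment as a 1 x channels x time "image",
            # convolving and downsampling along the time axis only.
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=(1, 5), stride=(1, 2), padding=(0, 2)),
                nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=(1, 5), stride=(1, 2), padding=(0, 2)),
                nn.ReLU(),
            )
            # Decoder mirrors the encoder for the reconstruction branch.
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(32, 16, kernel_size=(1, 4), stride=(1, 2), padding=(0, 1)),
                nn.ReLU(),
                nn.ConvTranspose2d(16, 1, kernel_size=(1, 4), stride=(1, 2), padding=(0, 1)),
            )
            # Bi-LSTM classifier over the latent time axis
            # (ictal vs. interictal, hence 2 output logits).
            self.lstm = nn.LSTM(input_size=32 * n_channels, hidden_size=64,
                                batch_first=True, bidirectional=True)
            self.head = nn.Linear(2 * 64, 2)

        def forward(self, x):                 # x: (batch, 1, channels, time)
            z = self.encoder(x)               # (batch, 32, channels, time/4)
            recon = self.decoder(z)           # reconstruction of the input
            seq = z.permute(0, 3, 1, 2).flatten(2)  # (batch, time/4, 32*channels)
            _, (h, _) = self.lstm(seq)        # final hidden states, both directions
            logits = self.head(torch.cat([h[0], h[1]], dim=1))
            return recon, logits

    model = SDCAE()
    x = torch.randn(8, 1, 23, 1024)           # one dummy batch of EEG segments
    y = torch.randint(0, 2, (8,))             # dummy ictal/interictal labels
    recon, logits = model(x)
    # Joint supervised training: reconstruction loss + classification loss.
    loss = nn.functional.mse_loss(recon, x) + nn.functional.cross_entropy(logits, y)
    loss.backward()

The key design point the abstract emphasizes is that the autoencoder is not pre-trained unsupervised and then frozen; encoder, decoder, and classifier are optimized together, so the latent features are shaped by the seizure-detection objective as well as by reconstruction.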
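The five reported metrics follow their standard definitions for a binary confusion matrix with ictal segments as the positive class. A plain restatement in code, not the authors' evaluation script:

    # Standard binary-classification metrics from confusion-matrix counts
    # (tp/tn/fp/fn), with ictal taken as the positive class.
    def metrics(tp: int, tn: int, fp: int, fn: int):
        accuracy    = (tp + tn) / (tp + tn + fp + fn)
        sensitivity = tp / (tp + fn)      # recall on ictal segments
        specificity = tn / (tn + fp)      # recall on interictal segments
        precision   = tp / (tp + fp)
        f1 = 2 * precision * sensitivity / (precision + sensitivity)
        return accuracy, sensitivity, specificity, precision, f1

In the paper these metrics are averaged over the 10 cross-validation folds, which is why each figure is reported as a mean ± standard deviation.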
r=[e.init.page_action.enabled,e.init.performance.capture_marks,e.init.performance.capture_measures,e.init.user_actions.enabled];p.RI&&e.init.user_actions.enabled&&(it.Zp.forEach((e=>(0,A.sp)(e,(e=>(0,x.p)("ua",[e],void 0,this.featureName,this.ee)),!0))),it.qN.forEach((e=>(0,A.sp)(e,(e=>(0,x.p)("ua",[e],void 0,this.featureName,this.ee)))))),r.some((e=>e))?this.importAggregator(e):this.deregisterDrain()}}var at=i(993),st=i(3785);class ct extends b{static featureName=at.TZ;constructor(e,t=!0){super(e,at.TZ,t);const r=this.ee;this.ee.on("wrap-logger-end",(function([e]){const{level:t,customAttributes:n}=this;(0,st.R)(r,e,n,t)})),this.importAggregator(e)}}new class extends o{constructor(t,r){super(r),p.gm?(this.features={},(0,R.bQ)(this.agentIdentifier,this),this.desiredFeatures=new Set(t.features||[]),this.desiredFeatures.add(w),this.runSoftNavOverSpa=[...this.desiredFeatures].some((e=>e.featureName===a.K7.softNav)),(0,d.j)(this,t,t.loaderType||"agent"),this.run()):(0,e.R)(21)}get config(){return{info:this.info,init:this.init,loader_config:this.loader_config,runtime:this.runtime}}run(){try{const t=u(this.agentIdentifier),r=[...this.desiredFeatures];r.sort(((e,t)=>a.P3[e.featureName]-a.P3[t.featureName])),r.forEach((r=>{if(!t[r.featureName]&&r.featureName!==a.K7.pageViewEvent)return;if(this.runSoftNavOverSpa&&r.featureName===a.K7.spa)return;if(!this.runSoftNavOverSpa&&r.featureName===a.K7.softNav)return;const n=function(e){switch(e){case a.K7.ajax:return[a.K7.jserrors];case a.K7.sessionTrace:return[a.K7.ajax,a.K7.pageViewEvent];case a.K7.sessionReplay:return[a.K7.sessionTrace];case a.K7.pageViewTiming:return[a.K7.pageViewEvent];default:return[]}}(r.featureName).filter((e=>!(e in this.features)));n.length>0&&(0,e.R)(36,{targetFeature:r.featureName,missingDependencies:n}),this.features[r.featureName]=new r(this)}))}catch(t){(0,e.R)(22,t);for(const e in this.features)this.features[e].abortHandler?.();const r=(0,R.Zm)();delete r.initializedAgents[this.agentIdentifier]?.api,delete r.initializedAgents[this.agentIdentifier]?.features,delete this.sharedAggregator;return r.ee.get(this.agentIdentifier).abort(),!1}}}({features:[he,w,N,Ne,_e,_,D,ot,ct,je,nt],loaderType:"spa"})})()})();</script><link rel="preload" href="/article-pages/_nuxt/4764e3b.js" as="script"><link rel="preload" href="/article-pages/_nuxt/8e7ee66.js" as="script"><link rel="preload" href="/article-pages/_nuxt/css/468b299.css" as="style"><link rel="preload" href="/article-pages/_nuxt/232bf4b.js" as="script"><link rel="preload" href="/article-pages/_nuxt/css/6a64fd3.css" as="style"><link rel="preload" href="/article-pages/_nuxt/3b10072.js" as="script"><link rel="preload" href="/article-pages/_nuxt/a07a553.js" as="script"><link rel="preload" href="/article-pages/_nuxt/css/e5cdfa1.css" as="style"><link rel="preload" href="/article-pages/_nuxt/94ee25c.js" as="script"><link rel="preload" href="/article-pages/_nuxt/css/82a0061.css" as="style"><link rel="preload" href="/article-pages/_nuxt/5465e0e.js" as="script"><link rel="preload" href="/article-pages/_nuxt/css/d80c00c.css" as="style"><link rel="preload" href="/article-pages/_nuxt/fb04c78.js" as="script"><link rel="preload" href="/article-pages/_nuxt/f8f682e.js" as="script"><link rel="stylesheet" href="/article-pages/_nuxt/css/468b299.css"><link rel="stylesheet" href="/article-pages/_nuxt/css/6a64fd3.css"><link rel="stylesheet" href="/article-pages/_nuxt/css/e5cdfa1.css"><link rel="stylesheet" href="/article-pages/_nuxt/css/82a0061.css"><link rel="stylesheet" 
href="/article-pages/_nuxt/css/d80c00c.css"> <meta property="fb:admins" content="1841006843"> </head> <body > <button class="BypassBlock__firstEl"></button> <a href="#main-content" class="BypassBlock__wrapper"> <span class="BypassBlock__button">Skip to main content</span> </a> <!-- Google Tag Manager (noscript) --> <noscript> <iframe src="https://tag-manager.frontiersin.org/ns.html?id=GTM-M322FV2&gtm_auth=owVbWxfaJr21yQv1fe1cAQ&gtm_preview=env-1&gtm_cookies_win=x" height="0" width="0" style="display:none;visibility:hidden"></iframe> </noscript> <!-- End Google Tag Manager (noscript) --> <div data-server-rendered="true" id="__nuxt"><div id="__layout"><div theme="red" class="ArticleLayout"><nav class="Ibar"><h1 class="acc-hidden">Top bar navigation</h1> <div class="Ibar__main"><div class="Ibar__wrapper"><button aria-label="Open Menu" data-event="iBar-btn-openMenu" class="Ibar__burger"></button> <div class="Ibar__logo"><a href="//www.frontiersin.org/" aria-label="Frontiershome" data-event="iBar-a-home" class="Ibar__logo__link"><svg viewBox="0 0 2811 590" fill="none" xmlns="http://www.w3.org/2000/svg" class="Ibar__logo__svg"><path d="M633.872 234.191h-42.674v-57.246h42.674c0-19.776 2.082-35.389 5.204-48.92 4.164-13.53 9.368-23.939 17.695-31.225 8.326-8.326 18.735-13.53 32.266-16.653 13.531-3.123 29.143-5.204 47.878-5.204h21.858c7.286 0 14.572 1.04 21.857 1.04v62.451c-8.326-1.041-16.653-2.082-23.939-2.082-10.408 0-17.694 1.041-23.939 4.164-6.245 3.122-9.368 10.408-9.368 22.898v13.531h53.083v57.246h-53.083v213.372h-89.512V234.191zM794.161 176.945h86.39v47.879h1.041c6.245-17.694 16.653-30.185 31.225-39.552 14.572-9.368 31.225-13.531 49.96-13.531h10.409c3.122 0 7.286 1.041 10.408 2.082v81.185c-6.245-2.082-11.449-3.122-16.653-4.163-5.204-1.041-11.449-1.041-16.654-1.041-11.449 0-20.816 2.082-29.143 5.204-8.327 3.123-15.613 8.327-20.817 14.572-5.204 6.245-10.408 12.49-12.49 20.817-3.123 8.326-4.163 15.612-4.163 23.939v133.228h-88.472V176.945h-1.041zM989.84 312.254c0-19.776 3.122-39.552 10.41-56.205 7.28-17.695 16.65-32.266 29.14-45.797 12.49-13.531 27.06-22.899 44.76-30.185 17.69-7.285 36.43-11.449 57.24-11.449 20.82 0 39.56 4.164 57.25 11.449 17.69 7.286 32.27 17.695 45.8 30.185 12.49 12.49 22.9 28.102 29.14 45.797 7.29 17.694 10.41 36.429 10.41 56.205 0 20.817-3.12 39.552-10.41 57.246-7.29 17.695-16.65 32.266-29.14 44.756-12.49 12.49-28.11 22.899-45.8 30.185-17.69 7.286-36.43 11.449-57.25 11.449-20.81 0-40.59-4.163-57.24-11.449-17.7-7.286-32.27-17.695-44.76-30.185-12.49-12.49-21.86-28.102-29.14-44.756-7.288-17.694-10.41-36.429-10.41-57.246zm88.47 0c0 8.327 1.04 17.694 3.12 26.021 2.09 9.368 5.21 16.653 9.37 23.939 4.16 7.286 9.37 13.531 16.65 17.695 7.29 4.163 15.62 7.285 26.03 7.285 10.4 0 18.73-2.081 26.02-7.285 7.28-4.164 12.49-10.409 16.65-17.695 4.16-7.286 7.29-15.612 9.37-23.939 2.08-9.368 3.12-17.694 3.12-26.021 0-8.327-1.04-17.694-3.12-26.021-2.08-9.368-5.21-16.653-9.37-23.939-4.16-7.286-9.37-13.531-16.65-17.695-7.29-5.204-15.62-7.285-26.02-7.285-10.41 0-18.74 2.081-26.03 7.285-7.28 5.205-12.49 10.409-16.65 17.695-4.16 7.286-7.28 15.612-9.37 23.939-2.08 9.368-3.12 17.694-3.12 26.021zM1306.25 176.945h86.39v37.47h1.04c4.17-7.286 9.37-13.531 15.62-18.735 6.24-5.204 13.53-10.408 20.81-14.572 7.29-4.163 15.62-7.286 23.94-9.367 8.33-2.082 16.66-3.123 24.98-3.123 22.9 0 40.6 4.164 53.09 11.449 13.53 7.286 22.89 16.654 29.14 27.062 6.24 10.409 10.41 21.858 12.49 34.348 2.08 12.49 2.08 22.898 2.08 
33.307v172.779h-88.47V316.417v-27.061c0-9.368-1.04-16.654-4.16-23.94-3.13-7.286-7.29-12.49-13.53-16.653-6.25-4.164-15.62-6.245-27.07-6.245-8.32 0-15.61 2.081-21.85 5.204-6.25 3.122-11.45 7.286-14.58 13.531-4.16 5.204-6.24 11.449-8.32 18.735s-3.12 14.572-3.12 21.858v145.717h-88.48V176.945zM1780.88 234.19h-55.17v122.819c0 10.408 3.12 17.694 8.33 20.817 6.24 3.122 13.53 5.204 22.9 5.204 4.16 0 7.28 0 11.45-1.041h11.45v65.573c-8.33 0-15.62 1.041-23.94 2.082-8.33 1.04-16.66 1.041-23.94 1.041-18.74 0-34.35-2.082-46.84-5.205-12.49-3.122-21.86-8.326-29.14-15.612-7.29-7.286-12.49-16.654-14.58-29.144-3.12-12.49-4.16-27.062-4.16-45.797V234.19h-44.76v-57.246h44.76V94.717h88.47v82.227h55.17v57.246zM1902.66 143.639h-88.48V75.984h88.48v67.655zm-89.52 33.307h88.48v270.618h-88.48V176.946zM2024.43 334.111c1.04 18.735 6.25 33.307 16.66 44.756 10.4 11.449 24.98 16.653 43.71 16.653 10.41 0 20.82-2.081 30.19-7.286 9.36-5.204 16.65-12.49 20.81-22.898h83.27c-4.16 15.613-10.41 29.144-19.78 40.593-9.36 11.449-19.77 20.817-31.22 28.102-12.49 7.286-24.98 12.491-39.55 16.654-14.57 3.122-29.15 5.204-43.72 5.204-21.86 0-41.63-3.122-60.37-9.367-18.73-6.246-34.34-15.613-46.83-28.103-12.49-12.49-22.9-27.062-30.19-45.797-7.28-17.694-10.41-38.511-10.41-60.369 0-20.817 4.17-39.552 11.45-57.246 7.29-17.694 17.7-32.266 31.23-44.756 13.53-12.49 29.14-21.858 46.83-29.144 17.7-7.286 36.43-10.408 56.21-10.408 23.94 0 45.8 4.163 63.49 12.49 17.7 8.327 33.31 19.776 44.76 35.389 11.45 15.612 20.81 32.266 26.02 52.042 5.2 19.776 8.33 41.633 7.28 64.532h-199.84v-1.041zm110.33-49.961c-1.04-15.612-6.24-28.102-15.61-39.551-9.37-10.409-21.86-16.654-37.47-16.654s-28.1 5.204-38.51 15.613c-10.41 10.408-16.66 23.939-18.74 40.592h110.33zM2254.46 176.945h86.39v47.879h1.04c6.25-17.694 16.65-30.185 31.23-39.552 14.57-9.368 31.22-13.531 49.96-13.531h10.4c3.13 0 7.29 1.041 10.41 2.082v81.185c-6.24-2.082-11.45-3.122-16.65-4.163-5.21-1.041-11.45-1.041-16.65-1.041-11.45 0-20.82 2.082-29.15 5.204-8.32 3.123-15.61 8.327-20.81 14.572-6.25 6.245-10.41 12.49-12.49 20.817-3.13 8.326-4.17 15.612-4.17 23.939v133.228h-88.47V176.945h-1.04zM2534.45 359.091c0 7.286 1.04 12.49 4.16 17.694 3.12 5.204 6.24 9.368 10.41 12.49 4.16 3.123 9.36 5.204 14.57 7.286 6.24 2.082 11.45 2.082 17.69 2.082 4.17 0 8.33 0 13.53-2.082 5.21-1.041 9.37-3.123 13.53-5.204 4.17-2.082 7.29-5.204 10.41-9.368 3.13-4.163 4.17-8.327 4.17-13.531 0-5.204-2.09-9.367-5.21-12.49-3.12-3.122-7.28-6.245-11.45-8.327-4.16-2.081-9.36-4.163-14.57-5.204-5.2-1.041-9.37-2.081-13.53-3.122-13.53-3.123-28.1-6.245-42.67-9.368-14.58-3.122-28.11-7.286-40.6-12.49-12.49-6.245-22.9-13.531-30.18-23.939-8.33-10.409-11.45-23.94-11.45-42.675 0-16.653 4.16-30.184 11.45-40.592 8.33-10.409 17.69-18.736 30.18-24.981 12.49-6.245 26.02-10.408 40.6-13.53 14.57-3.123 28.1-4.164 41.63-4.164 14.57 0 29.14 1.041 43.71 4.164 14.58 2.081 27.07 7.285 39.56 13.53 12.49 6.245 21.85 15.613 29.14 27.062 7.29 11.45 11.45 26.021 12.49 43.716h-82.23c0-10.409-4.16-18.736-11.45-23.94-7.28-4.163-16.65-7.286-28.1-7.286-4.16 0-8.32 0-12.49 1.041-4.16 1.041-8.32 1.041-12.49 2.082-4.16 1.041-7.28 3.122-9.37 6.245-2.08 3.122-4.16 6.245-4.16 11.449 0 6.245 3.12 11.449 10.41 15.613 6.24 4.163 14.57 7.286 24.98 10.408 10.41 2.082 20.82 5.204 32.27 7.286 11.44 2.082 22.89 4.163 33.3 6.245 13.53 3.123 24.98 7.286 33.31 13.531 9.37 6.245 15.61 12.49 20.82 19.776 5.2 7.286 9.36 14.572 11.45 21.858 2.08 7.285 3.12 13.53 3.12 19.776 0 17.694-4.17 33.306-11.45 45.796-8.33 12.491-17.7 21.858-30.19 30.185-12.49 7.286-26.02 12.49-41.63 16.653-15.61 3.123-31.22 
5.204-45.8 5.204-15.61 0-32.26-1.04-47.87-4.163-15.62-3.122-29.15-8.327-41.64-15.612a83.855 83.855 0 01-30.18-30.185c-8.33-12.49-12.49-28.102-12.49-46.838h84.31v-2.081z" fill="#FFFFFF" class="Ibar__logo__text"></path> <path d="M0 481.911V281.028l187.351-58.287v200.882L0 481.911z" fill="#8BC53F"></path> <path d="M187.351 423.623V222.741l126.983 87.431v200.882l-126.983-87.431z" fill="#EBD417"></path> <path d="M126.982 569.341L0 481.911l187.351-58.287 126.983 87.43-187.352 58.287z" fill="#034EA1"></path> <path d="M183.188 212.331l51.001-116.574 65.573 155.085-51.001 116.574-65.573-155.085z" fill="#712E74"></path> <path d="M248.761 367.415l51.001-116.574 171.739-28.102-49.96 115.533-172.78 29.143z" fill="#009FD1"></path> <path d="M299.762 250.842L234.189 95.757l171.739-28.103 65.573 155.085-171.739 28.103z" fill="#F6921E"></path> <path d="M187.352 222.741L59.328 198.802 44.757 71.819 172.78 95.76l14.572 126.982z" fill="#DA2128"></path> <path d="M172.78 95.758L44.757 71.818l70.777-70.776 128.023 23.94-70.777 70.776z" fill="#25BCBD"></path> <path d="M258.129 153.005l-70.777 69.736-14.571-126.982 70.777-70.778 14.571 128.024z" fill="#00844A"></path></svg></a></div> <a aria-label="Frontiers in Computational Neuroscience" href="//www.frontiersin.org/journals/computational-neuroscience" data-event="iBar-a-journalHome" class="Ibar__journalName"><div logoClass="Ibar__logo--mixed" class="Ibar__journalName__container"><div class="Ibar__journal__maskLogo" style="display:none;"><img src="" class="Ibar__journal__logo"></div> <div class="Ibar__journalName"><span>Frontiers in</span> <span> Computational Neuroscience</span></div></div></a> <div parent-data-event="iBar" class="Ibar__dropdown Ibar__dropdown--aboutUs"><button class="Ibar__dropdown__trigger"><!----> About us </button> <div class="Ibar__dropdown__menu"><div class="Ibar__dropdown__menu__header"><button aria-label="Close Dropdown" class="Ibar__dropdown__menu__header__title"> About us </button> <button aria-label="Close Dropdown" class="Ibar__close"></button></div> <div class="Ibar__dropdown__about"><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Who we are</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/mission" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Mission and values</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/history" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">History</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/leadership" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Leadership</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/awards" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Awards</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Impact and progress</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/impact" target="_self" data-event="iBar-aboutUs_1-a_impactAndProgress">Frontiers' impact</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://progressreport.frontiersin.org/?utm_source=fweb&amp;utm_medium=frep&amp;utm_campaign=pr20" target="_blank" data-event="iBar-aboutUs_1-a_impactAndProgress">Progress Report 2022</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/progress-reports" target="_self" 
data-event="iBar-aboutUs_1-a_impactAndProgress">All progress reports</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Publishing model</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/how-we-publish" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">How we publish</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/open-access" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Open access</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/fee-policy" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Fee policy</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/peer-review" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Peer review</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/research-integrity" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Research integrity</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/research-topics" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Research Topics</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Services</li> <li class="Ibar__dropdown__about__block__item"><a href="https://publishingpartnerships.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_3-a_services">Societies</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/open-access-agreements/consortia" target="_self" data-event="iBar-aboutUs_3-a_services">National consortia</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/open-access-agreements" target="_self" data-event="iBar-aboutUs_3-a_services">Institutional partnerships</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/collaborators" target="_self" data-event="iBar-aboutUs_3-a_services">Collaborators</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">More from Frontiers</li> <li class="Ibar__dropdown__about__block__item"><a href="https://forum.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Frontiers Forum</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersplanetprize.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Frontiers Planet Prize</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://pressoffice.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Press office</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.orgabout/sustainability" target="_self" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Sustainability</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://careers.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Career opportunities</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/contact" target="_self" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Contact us</a></li></ul></div></div></div> <a href="https://www.frontiersin.org/journals" data-event="iBar-a-allJournals" class="Ibar__link">All journals</a><a 
href="https://www.frontiersin.org/articles" data-event="iBar-a-allArticles" class="Ibar__link">All articles</a> <a href="https://www.frontiersin.org/submission/submit?domainid=1&amp;fieldid=55&amp;specialtyid=237&amp;entitytype=1&amp;entityid=9" data-event="iBar-a-submit" class="Ibar__button Ibar__submit">Submit your research</a> <div class="Ibar__spacer"></div> <a href="/search" aria-label="Search" data-event="iBar-a-search" class="Ibar__icon Ibar__icon--search"><span>Search</span></a> <!----> <!----> <!----> <div class="Ibar__userArea"></div></div></div> <div class="Ibar__menu Ibar__menu--journal"><div class="Ibar__menu__header"><div class="Ibar__logo"><div class="Ibar__logo"><a href="//www.frontiersin.org/" aria-label="Frontiershome" data-event="iBar-a-home" class="Ibar__logo__link"><svg viewBox="0 0 2811 590" fill="none" xmlns="http://www.w3.org/2000/svg" class="Ibar__logo__svg"><path d="M633.872 234.191h-42.674v-57.246h42.674c0-19.776 2.082-35.389 5.204-48.92 4.164-13.53 9.368-23.939 17.695-31.225 8.326-8.326 18.735-13.53 32.266-16.653 13.531-3.123 29.143-5.204 47.878-5.204h21.858c7.286 0 14.572 1.04 21.857 1.04v62.451c-8.326-1.041-16.653-2.082-23.939-2.082-10.408 0-17.694 1.041-23.939 4.164-6.245 3.122-9.368 10.408-9.368 22.898v13.531h53.083v57.246h-53.083v213.372h-89.512V234.191zM794.161 176.945h86.39v47.879h1.041c6.245-17.694 16.653-30.185 31.225-39.552 14.572-9.368 31.225-13.531 49.96-13.531h10.409c3.122 0 7.286 1.041 10.408 2.082v81.185c-6.245-2.082-11.449-3.122-16.653-4.163-5.204-1.041-11.449-1.041-16.654-1.041-11.449 0-20.816 2.082-29.143 5.204-8.327 3.123-15.613 8.327-20.817 14.572-5.204 6.245-10.408 12.49-12.49 20.817-3.123 8.326-4.163 15.612-4.163 23.939v133.228h-88.472V176.945h-1.041zM989.84 312.254c0-19.776 3.122-39.552 10.41-56.205 7.28-17.695 16.65-32.266 29.14-45.797 12.49-13.531 27.06-22.899 44.76-30.185 17.69-7.285 36.43-11.449 57.24-11.449 20.82 0 39.56 4.164 57.25 11.449 17.69 7.286 32.27 17.695 45.8 30.185 12.49 12.49 22.9 28.102 29.14 45.797 7.29 17.694 10.41 36.429 10.41 56.205 0 20.817-3.12 39.552-10.41 57.246-7.29 17.695-16.65 32.266-29.14 44.756-12.49 12.49-28.11 22.899-45.8 30.185-17.69 7.286-36.43 11.449-57.25 11.449-20.81 0-40.59-4.163-57.24-11.449-17.7-7.286-32.27-17.695-44.76-30.185-12.49-12.49-21.86-28.102-29.14-44.756-7.288-17.694-10.41-36.429-10.41-57.246zm88.47 0c0 8.327 1.04 17.694 3.12 26.021 2.09 9.368 5.21 16.653 9.37 23.939 4.16 7.286 9.37 13.531 16.65 17.695 7.29 4.163 15.62 7.285 26.03 7.285 10.4 0 18.73-2.081 26.02-7.285 7.28-4.164 12.49-10.409 16.65-17.695 4.16-7.286 7.29-15.612 9.37-23.939 2.08-9.368 3.12-17.694 3.12-26.021 0-8.327-1.04-17.694-3.12-26.021-2.08-9.368-5.21-16.653-9.37-23.939-4.16-7.286-9.37-13.531-16.65-17.695-7.29-5.204-15.62-7.285-26.02-7.285-10.41 0-18.74 2.081-26.03 7.285-7.28 5.205-12.49 10.409-16.65 17.695-4.16 7.286-7.28 15.612-9.37 23.939-2.08 9.368-3.12 17.694-3.12 26.021zM1306.25 176.945h86.39v37.47h1.04c4.17-7.286 9.37-13.531 15.62-18.735 6.24-5.204 13.53-10.408 20.81-14.572 7.29-4.163 15.62-7.286 23.94-9.367 8.33-2.082 16.66-3.123 24.98-3.123 22.9 0 40.6 4.164 53.09 11.449 13.53 7.286 22.89 16.654 29.14 27.062 6.24 10.409 10.41 21.858 12.49 34.348 2.08 12.49 2.08 22.898 2.08 33.307v172.779h-88.47V316.417v-27.061c0-9.368-1.04-16.654-4.16-23.94-3.13-7.286-7.29-12.49-13.53-16.653-6.25-4.164-15.62-6.245-27.07-6.245-8.32 0-15.61 2.081-21.85 5.204-6.25 3.122-11.45 7.286-14.58 13.531-4.16 5.204-6.24 11.449-8.32 18.735s-3.12 14.572-3.12 21.858v145.717h-88.48V176.945zM1780.88 234.19h-55.17v122.819c0 10.408 3.12 17.694 8.33 
20.817 6.24 3.122 13.53 5.204 22.9 5.204 4.16 0 7.28 0 11.45-1.041h11.45v65.573c-8.33 0-15.62 1.041-23.94 2.082-8.33 1.04-16.66 1.041-23.94 1.041-18.74 0-34.35-2.082-46.84-5.205-12.49-3.122-21.86-8.326-29.14-15.612-7.29-7.286-12.49-16.654-14.58-29.144-3.12-12.49-4.16-27.062-4.16-45.797V234.19h-44.76v-57.246h44.76V94.717h88.47v82.227h55.17v57.246zM1902.66 143.639h-88.48V75.984h88.48v67.655zm-89.52 33.307h88.48v270.618h-88.48V176.946zM2024.43 334.111c1.04 18.735 6.25 33.307 16.66 44.756 10.4 11.449 24.98 16.653 43.71 16.653 10.41 0 20.82-2.081 30.19-7.286 9.36-5.204 16.65-12.49 20.81-22.898h83.27c-4.16 15.613-10.41 29.144-19.78 40.593-9.36 11.449-19.77 20.817-31.22 28.102-12.49 7.286-24.98 12.491-39.55 16.654-14.57 3.122-29.15 5.204-43.72 5.204-21.86 0-41.63-3.122-60.37-9.367-18.73-6.246-34.34-15.613-46.83-28.103-12.49-12.49-22.9-27.062-30.19-45.797-7.28-17.694-10.41-38.511-10.41-60.369 0-20.817 4.17-39.552 11.45-57.246 7.29-17.694 17.7-32.266 31.23-44.756 13.53-12.49 29.14-21.858 46.83-29.144 17.7-7.286 36.43-10.408 56.21-10.408 23.94 0 45.8 4.163 63.49 12.49 17.7 8.327 33.31 19.776 44.76 35.389 11.45 15.612 20.81 32.266 26.02 52.042 5.2 19.776 8.33 41.633 7.28 64.532h-199.84v-1.041zm110.33-49.961c-1.04-15.612-6.24-28.102-15.61-39.551-9.37-10.409-21.86-16.654-37.47-16.654s-28.1 5.204-38.51 15.613c-10.41 10.408-16.66 23.939-18.74 40.592h110.33zM2254.46 176.945h86.39v47.879h1.04c6.25-17.694 16.65-30.185 31.23-39.552 14.57-9.368 31.22-13.531 49.96-13.531h10.4c3.13 0 7.29 1.041 10.41 2.082v81.185c-6.24-2.082-11.45-3.122-16.65-4.163-5.21-1.041-11.45-1.041-16.65-1.041-11.45 0-20.82 2.082-29.15 5.204-8.32 3.123-15.61 8.327-20.81 14.572-6.25 6.245-10.41 12.49-12.49 20.817-3.13 8.326-4.17 15.612-4.17 23.939v133.228h-88.47V176.945h-1.04zM2534.45 359.091c0 7.286 1.04 12.49 4.16 17.694 3.12 5.204 6.24 9.368 10.41 12.49 4.16 3.123 9.36 5.204 14.57 7.286 6.24 2.082 11.45 2.082 17.69 2.082 4.17 0 8.33 0 13.53-2.082 5.21-1.041 9.37-3.123 13.53-5.204 4.17-2.082 7.29-5.204 10.41-9.368 3.13-4.163 4.17-8.327 4.17-13.531 0-5.204-2.09-9.367-5.21-12.49-3.12-3.122-7.28-6.245-11.45-8.327-4.16-2.081-9.36-4.163-14.57-5.204-5.2-1.041-9.37-2.081-13.53-3.122-13.53-3.123-28.1-6.245-42.67-9.368-14.58-3.122-28.11-7.286-40.6-12.49-12.49-6.245-22.9-13.531-30.18-23.939-8.33-10.409-11.45-23.94-11.45-42.675 0-16.653 4.16-30.184 11.45-40.592 8.33-10.409 17.69-18.736 30.18-24.981 12.49-6.245 26.02-10.408 40.6-13.53 14.57-3.123 28.1-4.164 41.63-4.164 14.57 0 29.14 1.041 43.71 4.164 14.58 2.081 27.07 7.285 39.56 13.53 12.49 6.245 21.85 15.613 29.14 27.062 7.29 11.45 11.45 26.021 12.49 43.716h-82.23c0-10.409-4.16-18.736-11.45-23.94-7.28-4.163-16.65-7.286-28.1-7.286-4.16 0-8.32 0-12.49 1.041-4.16 1.041-8.32 1.041-12.49 2.082-4.16 1.041-7.28 3.122-9.37 6.245-2.08 3.122-4.16 6.245-4.16 11.449 0 6.245 3.12 11.449 10.41 15.613 6.24 4.163 14.57 7.286 24.98 10.408 10.41 2.082 20.82 5.204 32.27 7.286 11.44 2.082 22.89 4.163 33.3 6.245 13.53 3.123 24.98 7.286 33.31 13.531 9.37 6.245 15.61 12.49 20.82 19.776 5.2 7.286 9.36 14.572 11.45 21.858 2.08 7.285 3.12 13.53 3.12 19.776 0 17.694-4.17 33.306-11.45 45.796-8.33 12.491-17.7 21.858-30.19 30.185-12.49 7.286-26.02 12.49-41.63 16.653-15.61 3.123-31.22 5.204-45.8 5.204-15.61 0-32.26-1.04-47.87-4.163-15.62-3.122-29.15-8.327-41.64-15.612a83.855 83.855 0 01-30.18-30.185c-8.33-12.49-12.49-28.102-12.49-46.838h84.31v-2.081z" fill="#FFFFFF" class="Ibar__logo__text"></path> <path d="M0 481.911V281.028l187.351-58.287v200.882L0 481.911z" fill="#8BC53F"></path> <path d="M187.351 423.623V222.741l126.983 
87.431v200.882l-126.983-87.431z" fill="#EBD417"></path> <path d="M126.982 569.341L0 481.911l187.351-58.287 126.983 87.43-187.352 58.287z" fill="#034EA1"></path> <path d="M183.188 212.331l51.001-116.574 65.573 155.085-51.001 116.574-65.573-155.085z" fill="#712E74"></path> <path d="M248.761 367.415l51.001-116.574 171.739-28.102-49.96 115.533-172.78 29.143z" fill="#009FD1"></path> <path d="M299.762 250.842L234.189 95.757l171.739-28.103 65.573 155.085-171.739 28.103z" fill="#F6921E"></path> <path d="M187.352 222.741L59.328 198.802 44.757 71.819 172.78 95.76l14.572 126.982z" fill="#DA2128"></path> <path d="M172.78 95.758L44.757 71.818l70.777-70.776 128.023 23.94-70.777 70.776z" fill="#25BCBD"></path> <path d="M258.129 153.005l-70.777 69.736-14.571-126.982 70.777-70.778 14.571 128.024z" fill="#00844A"></path></svg></a></div></div> <button aria-label="Close Menu" data-event="iBarMenu-btn-closeMenu" class="Ibar__close"></button></div> <div class="Ibar__menu__wrapper"><div class="Ibar__menu__journal"><a href="//www.frontiersin.org/journals/computational-neuroscience" data-event="iBarMenu-a-journalHome"><div class="Ibar__journalName__container"><div class="Ibar__journal__maskLogo" style="display:none;"><img src="" class="Ibar__journal__logo"></div> <div class="Ibar__journalName"><span>Frontiers in</span> <span> Computational Neuroscience</span></div></div></a> <!----> <a href="//www.frontiersin.org/journals/computational-neuroscience/articles" data-event="iBar-a-articles" class="Ibar__link">Articles</a><a href="//www.frontiersin.org/journals/computational-neuroscience/research-topics" data-event="iBar-a-researchTopics" class="Ibar__link">Research Topics</a><a href="//www.frontiersin.org/journals/computational-neuroscience/editors" data-event="iBar-a-editorialBoard" class="Ibar__link">Editorial board</a> <div parent-data-event="iBarMenu" class="Ibar__dropdown"><button class="Ibar__dropdown__trigger"><!----> About journal </button> <div class="Ibar__dropdown__menu"><div class="Ibar__dropdown__menu__header"><button aria-label="Close Dropdown" class="Ibar__dropdown__menu__header__title"> About journal </button> <button aria-label="Close Dropdown" class="Ibar__close"></button></div> <div class="Ibar__dropdown__about"><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Scope</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-editors" target="_self" data-event="iBar-aboutJournal_0-a_scope">Specialty chief editors</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-scope" target="_self" data-event="iBar-aboutJournal_0-a_scope">Mission &amp; scope</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-facts" target="_self" data-event="iBar-aboutJournal_0-a_scope">Facts</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-submission" target="_self" data-event="iBar-aboutJournal_0-a_scope">Submission</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-open" target="_self" data-event="iBar-aboutJournal_0-a_scope">Open access statement</a></li><li class="Ibar__dropdown__about__block__item"><a 
href="https://www.frontiersin.org/journals/computational-neuroscience/about#copyright-statement" target="_self" data-event="iBar-aboutJournal_0-a_scope">Copyright statement</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-quality" target="_self" data-event="iBar-aboutJournal_0-a_scope">Quality</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">For authors</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/why-submit" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Why submit?</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/article-types" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Article types</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/author-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Author guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/editor-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Editor guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/publishing-fees" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Publishing fees</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/submission-checklist" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Submission checklist</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/contact-editorial-office" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Contact editorial office</a></li></ul></div></div></div></div> <div parent-data-event="iBarMenu" class="Ibar__dropdown Ibar__dropdown--aboutUs"><button class="Ibar__dropdown__trigger"><!----> About us </button> <div class="Ibar__dropdown__menu"><div class="Ibar__dropdown__menu__header"><button aria-label="Close Dropdown" class="Ibar__dropdown__menu__header__title"> About us </button> <button aria-label="Close Dropdown" class="Ibar__close"></button></div> <div class="Ibar__dropdown__about"><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Who we are</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/mission" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Mission and values</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/history" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">History</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/leadership" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Leadership</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/awards" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Awards</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Impact and progress</li> <li 
class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/impact" target="_self" data-event="iBar-aboutUs_1-a_impactAndProgress">Frontiers' impact</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://progressreport.frontiersin.org/?utm_source=fweb&amp;utm_medium=frep&amp;utm_campaign=pr20" target="_blank" data-event="iBar-aboutUs_1-a_impactAndProgress">Progress Report 2022</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/progress-reports" target="_self" data-event="iBar-aboutUs_1-a_impactAndProgress">All progress reports</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Publishing model</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/how-we-publish" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">How we publish</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/open-access" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Open access</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/fee-policy" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Fee policy</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/peer-review" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Peer review</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/research-integrity" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Research integrity</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/research-topics" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Research Topics</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Services</li> <li class="Ibar__dropdown__about__block__item"><a href="https://publishingpartnerships.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_3-a_services">Societies</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/open-access-agreements/consortia" target="_self" data-event="iBar-aboutUs_3-a_services">National consortia</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/open-access-agreements" target="_self" data-event="iBar-aboutUs_3-a_services">Institutional partnerships</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/collaborators" target="_self" data-event="iBar-aboutUs_3-a_services">Collaborators</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">More from Frontiers</li> <li class="Ibar__dropdown__about__block__item"><a href="https://forum.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Frontiers Forum</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersplanetprize.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Frontiers Planet Prize</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://pressoffice.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Press office</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.orgabout/sustainability" target="_self" 
data-event="iBar-aboutUs_4-a_moreFromFrontiers">Sustainability</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://careers.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Career opportunities</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/contact" target="_self" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Contact us</a></li></ul></div></div></div> <a href="https://www.frontiersin.org/journals" data-event="iBar-a-allJournals" class="Ibar__link">All journals</a><a href="https://www.frontiersin.org/articles" data-event="iBar-a-allArticles" class="Ibar__link">All articles</a> <!----> <!----> <!----> <a href="https://www.frontiersin.org/submission/submit?domainid=1&amp;fieldid=55&amp;specialtyid=237&amp;entitytype=1&amp;entityid=9" data-event="iBarMenu-a-submit" class="Ibar__button Ibar__submit">Submit your research</a></div></div> <div class="Ibar__journal"><div class="Ibar__wrapper Ibar__wrapper--journal"><a aria-label="Frontiers in Computational Neuroscience" href="//www.frontiersin.org/journals/computational-neuroscience" data-event="iBarJournal-a-journalHome" class="Ibar__journalName"><div class="Ibar__journalName__container"><div class="Ibar__journal__maskLogo" style="display:none;"><img src="" class="Ibar__journal__logo"></div> <div class="Ibar__journalName"><span>Frontiers in</span> <span> Computational Neuroscience</span></div></div></a> <!----> <a href="//www.frontiersin.org/journals/computational-neuroscience/articles" data-event="iBar-a-articles" class="Ibar__link">Articles</a><a href="//www.frontiersin.org/journals/computational-neuroscience/research-topics" data-event="iBar-a-researchTopics" class="Ibar__link">Research Topics</a><a href="//www.frontiersin.org/journals/computational-neuroscience/editors" data-event="iBar-a-editorialBoard" class="Ibar__link">Editorial board</a> <div parent-data-event="iBarJournal" class="Ibar__dropdown"><button class="Ibar__dropdown__trigger"><!----> About journal </button> <div class="Ibar__dropdown__menu"><div class="Ibar__dropdown__menu__header"><button aria-label="Close Dropdown" class="Ibar__dropdown__menu__header__title"> About journal </button> <button aria-label="Close Dropdown" class="Ibar__close"></button></div> <div class="Ibar__dropdown__about"><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Scope</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-editors" target="_self" data-event="iBar-aboutJournal_0-a_scope">Specialty chief editors</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-scope" target="_self" data-event="iBar-aboutJournal_0-a_scope">Mission &amp; scope</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-facts" target="_self" data-event="iBar-aboutJournal_0-a_scope">Facts</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-submission" target="_self" data-event="iBar-aboutJournal_0-a_scope">Submission</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-open" target="_self" data-event="iBar-aboutJournal_0-a_scope">Open access statement</a></li><li 
class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#copyright-statement" target="_self" data-event="iBar-aboutJournal_0-a_scope">Copyright statement</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-quality" target="_self" data-event="iBar-aboutJournal_0-a_scope">Quality</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">For authors</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/why-submit" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Why submit?</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/article-types" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Article types</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/author-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Author guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/editor-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Editor guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/publishing-fees" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Publishing fees</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/submission-checklist" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Submission checklist</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/contact-editorial-office" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Contact editorial office</a></li></ul></div></div></div> <div class="Ibar__spacer"></div></div></div> <div class="Ibar__journal Ibar__journal--mix"><div class="Ibar__wrapper Ibar__wrapper--journal"><div class="Ibar__logo"><a href="//www.frontiersin.org/" aria-label="Frontiershome" data-event="iBar-a-home" class="Ibar__logo__link"><svg viewBox="0 0 2811 590" fill="none" xmlns="http://www.w3.org/2000/svg" class="Ibar__logo__svg"><path d="M633.872 234.191h-42.674v-57.246h42.674c0-19.776 2.082-35.389 5.204-48.92 4.164-13.53 9.368-23.939 17.695-31.225 8.326-8.326 18.735-13.53 32.266-16.653 13.531-3.123 29.143-5.204 47.878-5.204h21.858c7.286 0 14.572 1.04 21.857 1.04v62.451c-8.326-1.041-16.653-2.082-23.939-2.082-10.408 0-17.694 1.041-23.939 4.164-6.245 3.122-9.368 10.408-9.368 22.898v13.531h53.083v57.246h-53.083v213.372h-89.512V234.191zM794.161 176.945h86.39v47.879h1.041c6.245-17.694 16.653-30.185 31.225-39.552 14.572-9.368 31.225-13.531 49.96-13.531h10.409c3.122 0 7.286 1.041 10.408 2.082v81.185c-6.245-2.082-11.449-3.122-16.653-4.163-5.204-1.041-11.449-1.041-16.654-1.041-11.449 0-20.816 2.082-29.143 5.204-8.327 3.123-15.613 8.327-20.817 14.572-5.204 6.245-10.408 12.49-12.49 20.817-3.123 8.326-4.163 15.612-4.163 23.939v133.228h-88.472V176.945h-1.041zM989.84 312.254c0-19.776 3.122-39.552 10.41-56.205 7.28-17.695 16.65-32.266 29.14-45.797 12.49-13.531 
27.06-22.899 44.76-30.185 17.69-7.285 36.43-11.449 57.24-11.449 20.82 0 39.56 4.164 57.25 11.449 17.69 7.286 32.27 17.695 45.8 30.185 12.49 12.49 22.9 28.102 29.14 45.797 7.29 17.694 10.41 36.429 10.41 56.205 0 20.817-3.12 39.552-10.41 57.246-7.29 17.695-16.65 32.266-29.14 44.756-12.49 12.49-28.11 22.899-45.8 30.185-17.69 7.286-36.43 11.449-57.25 11.449-20.81 0-40.59-4.163-57.24-11.449-17.7-7.286-32.27-17.695-44.76-30.185-12.49-12.49-21.86-28.102-29.14-44.756-7.288-17.694-10.41-36.429-10.41-57.246zm88.47 0c0 8.327 1.04 17.694 3.12 26.021 2.09 9.368 5.21 16.653 9.37 23.939 4.16 7.286 9.37 13.531 16.65 17.695 7.29 4.163 15.62 7.285 26.03 7.285 10.4 0 18.73-2.081 26.02-7.285 7.28-4.164 12.49-10.409 16.65-17.695 4.16-7.286 7.29-15.612 9.37-23.939 2.08-9.368 3.12-17.694 3.12-26.021 0-8.327-1.04-17.694-3.12-26.021-2.08-9.368-5.21-16.653-9.37-23.939-4.16-7.286-9.37-13.531-16.65-17.695-7.29-5.204-15.62-7.285-26.02-7.285-10.41 0-18.74 2.081-26.03 7.285-7.28 5.205-12.49 10.409-16.65 17.695-4.16 7.286-7.28 15.612-9.37 23.939-2.08 9.368-3.12 17.694-3.12 26.021zM1306.25 176.945h86.39v37.47h1.04c4.17-7.286 9.37-13.531 15.62-18.735 6.24-5.204 13.53-10.408 20.81-14.572 7.29-4.163 15.62-7.286 23.94-9.367 8.33-2.082 16.66-3.123 24.98-3.123 22.9 0 40.6 4.164 53.09 11.449 13.53 7.286 22.89 16.654 29.14 27.062 6.24 10.409 10.41 21.858 12.49 34.348 2.08 12.49 2.08 22.898 2.08 33.307v172.779h-88.47V316.417v-27.061c0-9.368-1.04-16.654-4.16-23.94-3.13-7.286-7.29-12.49-13.53-16.653-6.25-4.164-15.62-6.245-27.07-6.245-8.32 0-15.61 2.081-21.85 5.204-6.25 3.122-11.45 7.286-14.58 13.531-4.16 5.204-6.24 11.449-8.32 18.735s-3.12 14.572-3.12 21.858v145.717h-88.48V176.945zM1780.88 234.19h-55.17v122.819c0 10.408 3.12 17.694 8.33 20.817 6.24 3.122 13.53 5.204 22.9 5.204 4.16 0 7.28 0 11.45-1.041h11.45v65.573c-8.33 0-15.62 1.041-23.94 2.082-8.33 1.04-16.66 1.041-23.94 1.041-18.74 0-34.35-2.082-46.84-5.205-12.49-3.122-21.86-8.326-29.14-15.612-7.29-7.286-12.49-16.654-14.58-29.144-3.12-12.49-4.16-27.062-4.16-45.797V234.19h-44.76v-57.246h44.76V94.717h88.47v82.227h55.17v57.246zM1902.66 143.639h-88.48V75.984h88.48v67.655zm-89.52 33.307h88.48v270.618h-88.48V176.946zM2024.43 334.111c1.04 18.735 6.25 33.307 16.66 44.756 10.4 11.449 24.98 16.653 43.71 16.653 10.41 0 20.82-2.081 30.19-7.286 9.36-5.204 16.65-12.49 20.81-22.898h83.27c-4.16 15.613-10.41 29.144-19.78 40.593-9.36 11.449-19.77 20.817-31.22 28.102-12.49 7.286-24.98 12.491-39.55 16.654-14.57 3.122-29.15 5.204-43.72 5.204-21.86 0-41.63-3.122-60.37-9.367-18.73-6.246-34.34-15.613-46.83-28.103-12.49-12.49-22.9-27.062-30.19-45.797-7.28-17.694-10.41-38.511-10.41-60.369 0-20.817 4.17-39.552 11.45-57.246 7.29-17.694 17.7-32.266 31.23-44.756 13.53-12.49 29.14-21.858 46.83-29.144 17.7-7.286 36.43-10.408 56.21-10.408 23.94 0 45.8 4.163 63.49 12.49 17.7 8.327 33.31 19.776 44.76 35.389 11.45 15.612 20.81 32.266 26.02 52.042 5.2 19.776 8.33 41.633 7.28 64.532h-199.84v-1.041zm110.33-49.961c-1.04-15.612-6.24-28.102-15.61-39.551-9.37-10.409-21.86-16.654-37.47-16.654s-28.1 5.204-38.51 15.613c-10.41 10.408-16.66 23.939-18.74 40.592h110.33zM2254.46 176.945h86.39v47.879h1.04c6.25-17.694 16.65-30.185 31.23-39.552 14.57-9.368 31.22-13.531 49.96-13.531h10.4c3.13 0 7.29 1.041 10.41 2.082v81.185c-6.24-2.082-11.45-3.122-16.65-4.163-5.21-1.041-11.45-1.041-16.65-1.041-11.45 0-20.82 2.082-29.15 5.204-8.32 3.123-15.61 8.327-20.81 14.572-6.25 6.245-10.41 12.49-12.49 20.817-3.13 8.326-4.17 15.612-4.17 23.939v133.228h-88.47V176.945h-1.04zM2534.45 359.091c0 7.286 1.04 12.49 4.16 17.694 3.12 5.204 6.24 9.368 10.41 
12.49 4.16 3.123 9.36 5.204 14.57 7.286 6.24 2.082 11.45 2.082 17.69 2.082 4.17 0 8.33 0 13.53-2.082 5.21-1.041 9.37-3.123 13.53-5.204 4.17-2.082 7.29-5.204 10.41-9.368 3.13-4.163 4.17-8.327 4.17-13.531 0-5.204-2.09-9.367-5.21-12.49-3.12-3.122-7.28-6.245-11.45-8.327-4.16-2.081-9.36-4.163-14.57-5.204-5.2-1.041-9.37-2.081-13.53-3.122-13.53-3.123-28.1-6.245-42.67-9.368-14.58-3.122-28.11-7.286-40.6-12.49-12.49-6.245-22.9-13.531-30.18-23.939-8.33-10.409-11.45-23.94-11.45-42.675 0-16.653 4.16-30.184 11.45-40.592 8.33-10.409 17.69-18.736 30.18-24.981 12.49-6.245 26.02-10.408 40.6-13.53 14.57-3.123 28.1-4.164 41.63-4.164 14.57 0 29.14 1.041 43.71 4.164 14.58 2.081 27.07 7.285 39.56 13.53 12.49 6.245 21.85 15.613 29.14 27.062 7.29 11.45 11.45 26.021 12.49 43.716h-82.23c0-10.409-4.16-18.736-11.45-23.94-7.28-4.163-16.65-7.286-28.1-7.286-4.16 0-8.32 0-12.49 1.041-4.16 1.041-8.32 1.041-12.49 2.082-4.16 1.041-7.28 3.122-9.37 6.245-2.08 3.122-4.16 6.245-4.16 11.449 0 6.245 3.12 11.449 10.41 15.613 6.24 4.163 14.57 7.286 24.98 10.408 10.41 2.082 20.82 5.204 32.27 7.286 11.44 2.082 22.89 4.163 33.3 6.245 13.53 3.123 24.98 7.286 33.31 13.531 9.37 6.245 15.61 12.49 20.82 19.776 5.2 7.286 9.36 14.572 11.45 21.858 2.08 7.285 3.12 13.53 3.12 19.776 0 17.694-4.17 33.306-11.45 45.796-8.33 12.491-17.7 21.858-30.19 30.185-12.49 7.286-26.02 12.49-41.63 16.653-15.61 3.123-31.22 5.204-45.8 5.204-15.61 0-32.26-1.04-47.87-4.163-15.62-3.122-29.15-8.327-41.64-15.612a83.855 83.855 0 01-30.18-30.185c-8.33-12.49-12.49-28.102-12.49-46.838h84.31v-2.081z" fill="#FFFFFF" class="Ibar__logo__text"></path> <path d="M0 481.911V281.028l187.351-58.287v200.882L0 481.911z" fill="#8BC53F"></path> <path d="M187.351 423.623V222.741l126.983 87.431v200.882l-126.983-87.431z" fill="#EBD417"></path> <path d="M126.982 569.341L0 481.911l187.351-58.287 126.983 87.43-187.352 58.287z" fill="#034EA1"></path> <path d="M183.188 212.331l51.001-116.574 65.573 155.085-51.001 116.574-65.573-155.085z" fill="#712E74"></path> <path d="M248.761 367.415l51.001-116.574 171.739-28.102-49.96 115.533-172.78 29.143z" fill="#009FD1"></path> <path d="M299.762 250.842L234.189 95.757l171.739-28.103 65.573 155.085-171.739 28.103z" fill="#F6921E"></path> <path d="M187.352 222.741L59.328 198.802 44.757 71.819 172.78 95.76l14.572 126.982z" fill="#DA2128"></path> <path d="M172.78 95.758L44.757 71.818l70.777-70.776 128.023 23.94-70.777 70.776z" fill="#25BCBD"></path> <path d="M258.129 153.005l-70.777 69.736-14.571-126.982 70.777-70.778 14.571 128.024z" fill="#00844A"></path></svg></a></div> <a aria-label="Frontiers in Computational Neuroscience" href="//www.frontiersin.org/journals/computational-neuroscience" data-event="iBarJournal-a-journalHome" class="Ibar__journalName"><div logoClass="Ibar__logo--mixed" class="Ibar__journalName__container"><div class="Ibar__journal__maskLogo" style="display:none;"><img src="" class="Ibar__journal__logo"></div> <div class="Ibar__journalName"><span>Frontiers in</span> <span> Computational Neuroscience</span></div></div></a> <div class="Ibar__spacer"></div> <!----> <a href="//www.frontiersin.org/journals/computational-neuroscience/articles" data-event="iBar-a-articles" class="Ibar__link">Articles</a><a href="//www.frontiersin.org/journals/computational-neuroscience/research-topics" data-event="iBar-a-researchTopics" class="Ibar__link">Research Topics</a><a href="//www.frontiersin.org/journals/computational-neuroscience/editors" data-event="iBar-a-editorialBoard" class="Ibar__link">Editorial board</a> <div parent-data-event="iBarJournal" 
class="Ibar__dropdown"><button class="Ibar__dropdown__trigger"><!----> About journal </button> <div class="Ibar__dropdown__menu"><div class="Ibar__dropdown__menu__header"><button aria-label="Close Dropdown" class="Ibar__dropdown__menu__header__title"> About journal </button> <button aria-label="Close Dropdown" class="Ibar__close"></button></div> <div class="Ibar__dropdown__about"><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Scope</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-editors" target="_self" data-event="iBar-aboutJournal_0-a_scope">Specialty chief editors</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-scope" target="_self" data-event="iBar-aboutJournal_0-a_scope">Mission &amp; scope</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-facts" target="_self" data-event="iBar-aboutJournal_0-a_scope">Facts</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-submission" target="_self" data-event="iBar-aboutJournal_0-a_scope">Submission</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-open" target="_self" data-event="iBar-aboutJournal_0-a_scope">Open access statement</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#copyright-statement" target="_self" data-event="iBar-aboutJournal_0-a_scope">Copyright statement</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/about#about-quality" target="_self" data-event="iBar-aboutJournal_0-a_scope">Quality</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">For authors</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/why-submit" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Why submit?</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/article-types" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Article types</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/author-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Author guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/editor-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Editor guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/publishing-fees" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Publishing fees</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/submission-checklist" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Submission checklist</a></li><li 
class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/computational-neuroscience/for-authors/contact-editorial-office" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Contact editorial office</a></li></ul></div></div></div> <div class="Ibar__spacer"></div> <a href="https://www.frontiersin.org/submission/submit?domainid=1&amp;fieldid=55&amp;specialtyid=237&amp;entitytype=1&amp;entityid=9" data-event="iBarJournal-a-submit" class="Ibar__button Ibar__submit"><span>Submit</span> <span> your research</span></a> <a href="/search" aria-label="Search" data-event="iBar-a-search" class="Ibar__icon Ibar__icon--search"><span>Search</span></a> <!----> <!----> <!----> <div class="Ibar__userArea"></div></div></div></nav> <div class="ArticlePage"><div><div class="Layout Layout--withAside Layout--withIbarMix ArticleDetails"><!----> <aside class="Layout__aside"><div class="ArticleDetails__wrapper"><div class="ArticleDetails__aside"><div class="ArticleDetails__aside__responsiveButtons"><div id="FloatingButtonsEl" class="ActionsDropDown"><button aria-label="Open dropdown" data-event="actionsDropDown-button-toggle" class="ActionsDropDown__button ActionsDropDown__button--type ActionsDropDown__button--icon"><span class="ActionsDropDown__button__label">Download article</span></button> <div class="ActionsDropDown__menuWrapper"><!----> <ul class="ActionsDropDown__menu"><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/pdf" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-pdf" class="ActionsDropDown__option"> Download PDF </a></li><li><a href="http://www.readcube.com/articles/10.3389/fncom.2021.650050" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-readCube" class="ActionsDropDown__option"> ReadCube </a></li><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/epub" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-epub" class="ActionsDropDown__option"> EPUB </a></li><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/xml/nlm" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-nlmXml" class="ActionsDropDown__option"> XML (NLM) </a></li></ul> <button aria-label="Close modal" data-event="actionsDropDown-button-close" class="ActionsDropDown__mobileClose"></button></div></div> <div class="ArticleDetails__aside__responsiveButtons__items"><!----> <div class="ArticleDetailsShare__responsive"><button aria-label="Open share options" class="ArticleDetailsShare__trigger"></button> <div class="ArticleDetailsShare"><h5 class="ArticleDetailsShare__title">Share on</h5> <ul class="ArticleDetailsShare__list"><li class="ArticleDetailsShare__item"><a href="https://www.twitter.com/share?url=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/full" target="_blank" title="Share on X" aria-label="Share on X" class="ArticleDetailsShare__link ArticleDetailsShare__link--x"></a></li><li class="ArticleDetailsShare__item"><a href="https://www.linkedin.com/share?url=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/full" target="_blank" title="Share on Linkedin" aria-label="Share on Linkedin" class="ArticleDetailsShare__link ArticleDetailsShare__link--linkedin"></a></li><li class="ArticleDetailsShare__item"><a 
href="https://www.facebook.com/sharer/sharer.php?u=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/full" target="_blank" title="Share on Facebook" aria-label="Share on Facebook" class="ArticleDetailsShare__link ArticleDetailsShare__link--facebook"></a></li></ul></div></div> <div class="ActionsDropDown"><button aria-label="Open dropdown" data-event="actionsDropDown-button-toggle" class="ActionsDropDown__button ActionsDropDown__button--typeIconButton ActionsDropDown__button--iconQuote"><!----></button> <div class="ActionsDropDown__menuWrapper"><div class="ActionsDropDown__mobileTitle"> Export citation </div> <ul class="ActionsDropDown__menu"><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/endNote" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-endNote" class="ActionsDropDown__option"> EndNote </a></li><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/reference" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-referenceManager" class="ActionsDropDown__option"> Reference Manager </a></li><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/text" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-simpleTextFile" class="ActionsDropDown__option"> Simple Text file </a></li><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/bibTex" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-bibTex" class="ActionsDropDown__option"> BibTex </a></li></ul> <button aria-label="Close modal" data-event="actionsDropDown-button-close" class="ActionsDropDown__mobileClose"></button></div></div></div></div> <div class="TotalViews"><div class="TotalViews__data"><div class="TotalViews__data__metrics"><div class="TotalViews__data__metrics__number"> 19,2K </div> <div class="TotalViews__data__metrics__text"><div class="TotalViews__data__metrics__label">Total views</div></div></div> <div class="TotalViews__data__metrics"><div class="TotalViews__data__metrics__number"> 3,2K </div> <div class="TotalViews__data__metrics__text"><div class="TotalViews__data__metrics__label">Downloads</div></div></div> <div class="TotalViews__data__metrics"><div class="TotalViews__data__metrics__number"> 101 </div> <div class="TotalViews__data__metrics__text"><div class="TotalViews__data__metrics__label">Citations</div></div></div> <div class="ImpactMetricsInfoPopover"><button aria-label="Open impact metrics info" class="ImpactMetricsInfoPopover__button"></button> <div class="ImpactMetricsInfoPopover__tooltip"><button aria-label="Close impact metrics info" class="ImpactMetricsInfoPopover__tooltip__closeButton"></button> <div class="ImpactMetricsInfoPopover__tooltip__text"> Citation numbers are available from Dimensions </div></div></div></div> <div class="TotalViews__viewImpactLink"><span class="Link__wrapper"><a aria-label="View article impact" href="http://loop-impact.frontiersin.org/impact/article/650050#totalviews/views" target="_blank" data-event="customLink-link-a_viewArticleImpact" class="Link Link--linkType Link--maincolor Link--medium Link--icon Link--chevronRight Link--right"><span>View article impact</span></a></span></div> <div class="TotalViews__altmetric"><div data-badge-popover="bottom" data-badge-type="donut" data-doi="10.3389/fncom.2021.650050" data-condensed="true" data-link-target="new" class="altmetric-embed"></div> <span class="Link__wrapper"><a 
aria-label="View altmetric score" href="https://www.altmetric.com/details/doi/10.3389/fncom.2021.650050" target="_blank" data-event="customLink-link-a_viewAltmetricScore" class="Link Link--linkType Link--maincolor Link--medium Link--icon Link--chevronRight Link--right"><span>View altmetric score</span></a></span></div></div> <div class="ArticleDetailsShare"><h5 class="ArticleDetailsShare__title">Share on</h5> <ul class="ArticleDetailsShare__list"><li class="ArticleDetailsShare__item"><a href="https://www.twitter.com/share?url=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/full" target="_blank" title="Share on X" aria-label="Share on X" class="ArticleDetailsShare__link ArticleDetailsShare__link--x"></a></li><li class="ArticleDetailsShare__item"><a href="https://www.linkedin.com/share?url=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/full" target="_blank" title="Share on Linkedin" aria-label="Share on Linkedin" class="ArticleDetailsShare__link ArticleDetailsShare__link--linkedin"></a></li><li class="ArticleDetailsShare__item"><a href="https://www.facebook.com/sharer/sharer.php?u=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/full" target="_blank" title="Share on Facebook" aria-label="Share on Facebook" class="ArticleDetailsShare__link ArticleDetailsShare__link--facebook"></a></li></ul></div> <div class="ArticleDetailsEditors"><div class="ArticleDetailsEditors__editors"><div class="ArticleDetailsEditors__title">Edited by</div> <a href="https://loop.frontiersin.org/people/1003479/overview" data-event="editorInfo-a-samanSargolzaei" class="ArticleDetailsEditors__ediorInfo"><figure class="Avatar Avatar--size-32"><img src="https://loop.frontiersin.org/images/profile/1003479/32" alt="Saman Sargolzaei" class="Avatar__img is-inside-mask"></figure> <div class="ArticleDetailsEditors__ediorInfo__info"><div class="ArticleDetailsEditors__ediorInfo__name"> Saman Sargolzaei </div> <div class="ArticleDetailsEditors__ediorInfo__affiliation"> University of Tennessee at Martin, United States </div></div></a></div></div> <div class="ArticleDetailsEditors"><div class="ArticleDetailsEditors__editors"><div class="ArticleDetailsEditors__title">Reviewed by</div> <a href="https://loop.frontiersin.org/people/222607/overview" data-event="editorInfo-a-antonioDourado" class="ArticleDetailsEditors__ediorInfo"><figure class="Avatar Avatar--size-32"><img src="https://loop.frontiersin.org/images/profile/222607/32" alt="Antonio Dourado" class="Avatar__img is-inside-mask"></figure> <div class="ArticleDetailsEditors__ediorInfo__info"><div class="ArticleDetailsEditors__ediorInfo__name"> Antonio Dourado </div> <div class="ArticleDetailsEditors__ediorInfo__affiliation"> University of Coimbra, Portugal </div></div></a><a href="https://loop.frontiersin.org/people/1191759/overview" data-event="editorInfo-a-nhanDuyTruong" class="ArticleDetailsEditors__ediorInfo"><figure class="Avatar Avatar--size-32"><img src="https://loop.frontiersin.org/images/profile/1191759/32" alt="Nhan Duy Truong" class="Avatar__img is-inside-mask"></figure> <div class="ArticleDetailsEditors__ediorInfo__info"><div class="ArticleDetailsEditors__ediorInfo__name"> Nhan Duy Truong </div> <div class="ArticleDetailsEditors__ediorInfo__affiliation"> The University of Sydney, Australia </div></div></a></div></div> <div class="ArticleDetailsGlossary ArticleDetailsGlossary--open"><button 
class="ArticleDetailsGlossary__header"><div class="ArticleDetailsGlossary__header__title">Table of contents</div> <div class="ArticleDetailsGlossary__header__arrow"></div></button> <div class="ArticleDetailsGlossary__content"><ul class="flyoutJournal"> <li><a href="#h1">Abstract</a></li> <li><a href="#h2">Introduction</a></li> <li><a href="#h3">Materials and Methods</a></li> <li><a href="#h4">Results</a></li> <li><a href="#h5">Discussion</a></li> <li><a href="#h6">Conclusion</a></li> <li><a href="#h7">Data Availability Statement</a></li> <li><a href="#h8">Ethics Statement</a></li> <li><a href="#h9">Author Contributions</a></li> <li><a href="#conf1">Conflict of Interest</a></li> <li><a href="#refer1">References</a></li> </ul> </div></div> <!----> <div class="ActionsDropDown"><button aria-label="Open dropdown" data-event="actionsDropDown-button-toggle" class="ActionsDropDown__button ActionsDropDown__button--typeOutline ActionsDropDown__button--iconQuote"><span class="ActionsDropDown__button__label">Export citation</span></button> <div class="ActionsDropDown__menuWrapper"><!----> <ul class="ActionsDropDown__menu"><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/endNote" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-endNote" class="ActionsDropDown__option"> EndNote </a></li><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/reference" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-referenceManager" class="ActionsDropDown__option"> Reference Manager </a></li><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/text" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-simpleTextFile" class="ActionsDropDown__option"> Simple Text file </a></li><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/bibTex" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-bibTex" class="ActionsDropDown__option"> BibTex </a></li></ul> <button aria-label="Close modal" data-event="actionsDropDown-button-close" class="ActionsDropDown__mobileClose"></button></div></div> <div class="CheckForUpdates"><button data-target="crossmark" data-event="checkForUpdates-btn-openModal" class="CheckForUpdates__link"><img src="/article-pages/_nuxt/img/crossmark.5c8ec60.svg" alt="Crossmark icon" class="CheckForUpdates__link__img"> <div class="CheckForUpdates__link__text">Check for updates</div></button></div> <!----> <!----></div> <!----> <div><div class="FloatingButtons"><!----> <div class="ActionsDropDown"><button aria-label="Open dropdown" data-event="actionsDropDown-button-toggle" class="ActionsDropDown__button ActionsDropDown__button--type ActionsDropDown__button--iconDownload"><span class="ActionsDropDown__button__label">Download article</span></button> <div class="ActionsDropDown__menuWrapper"><div class="ActionsDropDown__mobileTitle"> Download </div> <ul class="ActionsDropDown__menu"><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/pdf" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-pdf" class="ActionsDropDown__option"> Download PDF </a></li><li><a href="http://www.readcube.com/articles/10.3389/fncom.2021.650050" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-readCube" class="ActionsDropDown__option"> ReadCube </a></li><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/epub" target="_blank" 
rel="noopener noreferrer" data-event="actionsDropDown-a-epub" class="ActionsDropDown__option"> EPUB </a></li><li><a href="/journals/computational-neuroscience/articles/10.3389/fncom.2021.650050/xml/nlm" target="_blank" rel="noopener noreferrer" data-event="actionsDropDown-a-nlmXml" class="ActionsDropDown__option"> XML (NLM) </a></li></ul> <button aria-label="Close modal" data-event="actionsDropDown-button-close" class="ActionsDropDown__mobileClose"></button></div></div></div> <!----></div></div></aside> <main class="Layout__main"><!----> <section class="ArticleDetails__main"><div class="ArticleLayoutHeader"><div class="ArticleLayoutHeader__info"><h2 class="ArticleLayoutHeader__info__title">ORIGINAL RESEARCH article</h2> <div class="ArticleLayoutHeader__info__journalDate"><span>Front. Comput. Neurosci.</span><span>, 08 April 2021</span></div> <!----> <div class="ArticleLayoutHeader__info__doiVolume"><span> Volume 15 - 2021 | </span> <a href="https://doi.org/10.3389/fncom.2021.650050" class="ArticleLayoutHeader__info__doi"> https://doi.org/10.3389/fncom.2021.650050 </a></div> <!----></div> <!----> <div class="ArticleLayoutHeader__isPartOfRT"><span class="ArticleLayoutHeader__isPartOfRT__label">This article is part of the Research Topic</span> <span class="ArticleLayoutHeader__isPartOfRT__title">Deep Learning for Neurological Disorders in Children</span> <span class="Link__wrapper"><a aria-label="View all 11 articles" href="https://www.frontiersin.org/research-topics/14859/deep-learning-for-neurological-disorders-in-children/articles" target="_self" data-event="customLink-link-a_viewAll11Articles" class="Link Link--linkType Link--maincolor Link--medium Link--icon Link--chevronRight Link--right"><span>View all 11 articles</span></a></span></div></div> <div class="ArticleDetails__main__content"><div class="ArticleDetails__main__content__main ArticleDetails__main__content__main--fullArticle"><div class="JournalAbstract"><div class="JournalAbstract__titleWrapper"><h1>A Deep Learning Approach for Automatic Seizure Detection in Children With Epilepsy</h1> <!----></div> <!----></div> <div class="JournalFullText"><div class="JournalAbstract"> <a id="h1" name="h1"></a> <div class="authors"><span class="author-wrapper"> <a href="https://loop.frontiersin.org/people/1163050" class="user-id-1163050"><img class="pr5" src="https://loop.frontiersin.org/images/profile/1163050/74" onerror="this.onerror=null;this.src='https://loop.frontiersin.org/cdn/images/profile/default_32.jpg';" alt="\r\nAhmed Abdelhameed*">Ahmed Abdelhameed</a><sup>*</sup></span><span class="author-wrapper"><img class="pr5" src="https://loop.frontiersin.org/cdn/images/profile/default_32.jpg" alt="Magdy Bayoumi" onerror="this.onerror=null;this.src='https://loop.frontiersin.org/cdn/images/profile/default_32.jpg';">Magdy Bayoumi</span></div> <ul class="notes"> <li class="pl0">Department of Electrical and Computer Engineering, University of Louisiana at Lafayette, Lafayette, LA, United States</li> </ul> <p class="mb0">Over the last few decades, electroencephalogram (EEG) has become one of the most vital tools used by physicians to diagnose several neurological disorders of the human brain and, in particular, to detect seizures. Because of its peculiar nature, the consequent impact of epileptic seizures on the quality of life of patients made the precise diagnosis of epilepsy extremely essential. 
Therefore, this article proposes a novel deep-learning approach for detecting seizures in pediatric patients based on the classification of raw multichannel EEG signal recordings that are minimally pre-processed. The new approach takes advantage of the automatic feature learning capabilities of a two-dimensional deep convolution autoencoder (2D-DCAE) linked to a neural network-based classifier to form a unified system that is trained in a supervised way to achieve the best classification accuracy between ictal and interictal brain state signals. For testing and evaluating our approach, two models were designed and assessed using three different EEG data segment lengths and a 10-fold cross-validation scheme. Based on five evaluation metrics, the best performing model was a supervised deep convolutional autoencoder (SDCAE) model that uses a bidirectional long short-term memory (Bi-LSTM)-based classifier and an EEG segment length of 4 s. Using the public dataset collected from the Children's Hospital Boston (CHB) and the Massachusetts Institute of Technology (MIT), this model obtained 98.79 ± 0.53% accuracy, 98.72 ± 0.77% sensitivity, 98.86 ± 0.53% specificity, 98.86 ± 0.53% precision, and an F1-score of 98.79 ± 0.53%. Based on these results, our new approach provides one of the most effective seizure detection methods compared with other existing state-of-the-art methods applied to the same dataset.

Introduction

Epilepsy is recognized as one of the most critical and persistent neurological disorders affecting the human brain. It affects more than 50 million patients of various ages worldwide (World Health Organization, 2020), with approximately 450,000 patients under the age of 17 in the United States out of nearly 3 million American patients diagnosed with this disease (Epilepsy in Children, 2020). Epilepsy is characterized by recurrent unprovoked seizures. A seizure is a period of anomalous, synchronous innervation of a population of neurons that may last from seconds to a few minutes. Epileptic seizures are ephemeral instances of partial or complete abnormal involuntary movements of the body that may also be combined with a loss of consciousness. Although epileptic seizures occur only sporadically in each patient, their ensuing effects on patients' emotions, social interactions, and physical well-being make the diagnosis and treatment of epileptic seizures of the utmost significance.

Electroencephalograms (EEGs; Schomer and Lopez da Silva, 2018), which have been in use for a long time, are commonly employed by neurologists to diagnose several brain disorders and, in particular, epilepsy, for practical reasons such as availability, ease of use, and low cost. EEG operates by positioning several electrodes along the surface of the human scalp and then recording and measuring the voltage oscillations emanating from the ion currents flowing through the brain. These voltage oscillations, which correspond to the neuronal activity of the brain, are then transformed into multiple time series called signals.
EEG is a very powerful non-invasive diagnostic tool, since it can precisely capture epileptic signals that are characterized by spikes, sharp waves, or spike-and-wave complexes. As a result, EEG signals have been the most widely used in the clinical examination of various epileptic brain states, for both the detection and the prediction of epileptic seizures.

By interpreting the recorded EEG signals visually, neurologists can reliably distinguish between epileptic brain activity during a seizure (the ictal state) and normal brain activity between seizures (the interictal state). Over the last two decades, however, an abundance of automated EEG-based epilepsy diagnostic studies has been published. This work was motivated by the exhausting and time-consuming nature of the human visual evaluation process, which depends mainly on the doctor's expertise. Moreover, objective, rapid, and effective systems for processing vast amounts of EEG recordings have become indispensable for diminishing the possibility of misinterpretation. The availability of such systems would greatly enhance the quality of life of epileptic patients.

Following the acquisition and pre-processing of raw EEG signals, most automated seizure detection techniques consist of two key successive stages. The first stage concerns the extraction and selection of certain features of the EEG signals. In the second stage, a classification system is built and trained to utilize these extracted features for the detection of epileptic activity. The feature extraction step has a direct effect on the precision and complexity of the resulting automatic seizure detection technique. Due to the non-stationary nature of EEG signals, the feature extraction stage typically involves considerable work and significant domain knowledge to study and analyze the signals in the time domain, the frequency domain, or the time-frequency domain (Acharya et al., 2013). Building on this research, it is the system designer's task to extract the best-representing features that can precisely discriminate between the epileptic brain states in the EEG signals of different subjects.

In the literature, several EEG signal features extracted by various methods have been proposed for seizure detection. For example, Song et al. (2012) used approximate entropy and sample entropy as EEG features and integrated them with an extreme learning machine (ELM) for the automated detection of epileptic seizures. Chen et al. (2017) used non-subsampled wavelet–Fourier features for seizure detection. Wang et al. (2018) proposed an algorithm that combines wavelet decomposition and the directed transfer function (DTF) for feature extraction. Raghu et al. (2019) proposed using the matrix determinant as a feature for the analysis of epileptic EEG signals. Even with the achievement of great results, it is not inherently guaranteed that features derived through an intricate and error-prone manual feature extraction methodology will yield the maximum possible classification accuracy.
As such, it is worth working out how to build systems that can automatically learn the best representative features from minimally pre-processed EEG signals while at the same time achieving optimal classification performance.

Recent advances in machine learning, and particularly breakthroughs in deep learning techniques, have shown their superiority in automatically learning very robust features that outperform human-engineered features in many fields, such as speech recognition, natural language processing, and computer vision, as well as medical diagnosis (Wang et al., 2020). Multiple seizure detection systems that used artificial neural networks (ANNs) as classifiers, after traditional feature extraction, have been reported in previous work. For instance, Orhan et al. (2011) used a multilayer perceptron (MLP) for classification after using the discrete wavelet transform (DWT) and the K-means algorithm for feature extraction. Samiee et al. (2015) also used an MLP as a classifier after using the discrete short-time Fourier transform (DSTFT) for feature extraction. In Jaiswal and Banka (2017), ANNs were evaluated for classification after using the local neighbor descriptive pattern (LNDP) and one-dimensional local gradient pattern (1D-LGP) techniques for feature extraction. Yavuz et al. (2018) performed cepstral analysis utilizing a generalized regression neural network for EEG signal classification. On the other hand, convolutional neural networks (CNNs) have been adopted for both automatic feature learning and classification. For example, Acharya et al. (2018) proposed a deep CNN consisting of 13 layers for automatic seizure detection. For the same purpose, Abdelhameed et al. (2018a) designed a system that combined a one-dimensional CNN with a bidirectional long short-term memory (Bi-LSTM) recurrent neural network. Ke et al. (2018), Zhou et al. (2018), and Hossain et al. (2019) also used CNNs for feature extraction and classification. In Hu et al. (2019), a CNN and a support vector machine (SVM) were combined for feature extraction and classification of EEG signals.

As reported, most deep learning algorithms that involve automatic feature learning have targeted single-channel epileptic EEG signals. It is therefore still important to research data-driven algorithms that can handle more complex multichannel epileptic EEG signals.

In general, supervised learning is the most widely used technique for classifying EEG signals among all machine learning techniques. Several researchers have recently experimented with semi-supervised deep learning strategies in which an autoencoder (AE) neural network can benefit from training on both unlabeled and labeled data to improve the efficacy of the classification process (Gogna et al., 2017; Yuan et al., 2017; Abdelhameed and Bayoumi, 2018, 2019; She et al., 2018). Two approaches to using AEs have appeared in the literature. The first is the stacked-AE approach, in which each layer of a neural network consisting of multiple hidden layers is trained individually using an AE in an unsupervised way.
After that, all trained layers are stacked together and a softmax layer is attached to form a stacked network that is finally trained in a supervised fashion. The second approach uses deep AEs to pre-train all layers of the neural network simultaneously instead of that greedy layer-wise training. This latter approach still suffers from one particular drawback: the necessity of training the semi-supervised deep learning model twice. One training episode is conducted in an unsupervised way using unlabeled training data, which enables the AE to learn good initial parameters (weights). In the second episode, the first half of the pre-trained AE (the encoder network), attached to a selected classifier, is trained as a new system in a supervised manner using labeled data to perform the final classification task.

Therefore, in this work, to address the limitations of the classification schemes alluded to above, a novel deep learning-based system that uses a two-dimensional supervised deep convolutional autoencoder (2D-SDCAE) is proposed for the detection of epileptic seizures in multichannel EEG signal recordings. The innovation in the proposed system is that the AE is trained only once, in a supervised way, to perform two tasks at the same time. The first task is to automatically learn the best features from the EEG signals and summarize them in a succinct, low-dimensional latent space representation while performing the classification task efficiently. Consolidating the learning of both tasks in a single model, trained only once in a supervised way, has proven to improve the learning capabilities of the model and thus achieve better classification accuracy (a code sketch of this single-pass joint training is given at the end of this section).

In addition to operating directly on raw EEG signal data, our approach has several advantages. First, the SDCAE is faster than conventional semi-supervised classification systems, since it is trained only once. Second, to minimize the total number of network parameters, the proposed SDCAE uses convolutional layers for learning features instead of the fully connected layers commonly used in regular MLP-based AEs. Third, the proposed system can be used in signal compression schemes, as the original high-dimensional signals can be faithfully reconstructed from the low-dimensional latent representation using the second half of the AE (the decoder network). Finally, training AEs in a supervised way is more effective at learning a more structured latent representation, making it feasible to deploy very simple classifiers and still obtain very high-precision seizure detection systems. It is also worth noting that performance and hardware resource-saving have been taken into account to make the proposed system more suitable for real-time use and potential hardware implementation and deployment.
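To make the single-pass joint training concrete, the following toy sketch builds an autoencoder with an attached classifier head and optimizes a combined reconstruction + classification loss in one supervised training run. This is only an illustration of the idea: the paper does not specify an implementation framework, so Keras is assumed here, and all layer sizes and names (e.g., make_toy_sdcae) are hypothetical.

    from tensorflow.keras import layers, Model

    def make_toy_sdcae(input_shape=(512, 24, 1)):
        # Encoder: conv + pooling down to a small latent map.
        inp = layers.Input(shape=input_shape)
        x = layers.Conv2D(32, (3, 2), padding="same", activation="relu")(inp)
        x = layers.MaxPooling2D((2, 2))(x)
        latent = layers.Conv2D(64, (3, 2), padding="same", activation="relu")(x)

        # Decoder: upsample back to the input shape (reconstruction task).
        y = layers.UpSampling2D((2, 2))(latent)
        recon = layers.Conv2D(1, (3, 2), padding="same", activation="sigmoid",
                              name="reconstruction")(y)

        # Classifier head on the latent representation (classification task).
        z = layers.Flatten()(latent)
        z = layers.Dense(32, activation="relu")(z)
        label = layers.Dense(1, activation="sigmoid", name="label")(z)

        model = Model(inp, [recon, label])
        # One supervised training pass optimizes both losses jointly.
        model.compile(optimizer="adam",
                      loss={"reconstruction": "mse",
                            "label": "binary_crossentropy"})
        return model

    # Hypothetical usage: segments shaped (n, 512, 24, 1), labels in {0, 1}.
    # model.fit(segments, {"reconstruction": segments, "label": labels}, ...)

Because both losses flow through the shared encoder, the latent representation is shaped by the class labels as well as by reconstruction, which is the core of the SDCAE idea.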
Two SDCAE models are designed to test our novel approach, and their performance for seizure detection in children is evaluated. Both models are used to classify EEG data segments to distinguish between ictal and interictal brain states. The first model is a two-dimensional deep convolution autoencoder (2D-DCAE) in which the convolutional layers of the encoder network are attached to a simple MLP network consisting of two fully connected hidden layers and one output layer for classification. The second model is also a 2D-DCAE, but here the convolutional layers of the encoder network are attached to one Bi-LSTM recurrent neural network layer that carries out the classification task. In addition, the performance of both proposed models is compared with that of two regular deep learning models having the same layer structure, except that the decoder network layers are completely removed. These two models are trained in a supervised manner to perform only the classification task. By quantitatively evaluating the performance of the proposed models using different EEG segment lengths, we show that our new SDCAE approach is a very good candidate for producing one of the most accurate seizure detection systems.

Materials and Methods

Dataset

Patient data obtained from the online Children's Hospital Boston–Massachusetts Institute of Technology (CHB–MIT) database are used to assess and measure the efficacy of the proposed models. The dataset is recorded at Boston Children's Hospital and consists of long-term scalp EEG recordings of 23 pediatric patients with intractable seizures (Shoeb, 2009). The 23-channel EEG recordings are collected using 21 electrodes whose names are specified by the International 10–20 electrode positioning system using the modified combinatorial nomenclature, as shown in Figure 1. The signals are sampled at 256 Hz and band-pass filtered between 0 and 128 Hz.

Figure 1. 21 EEG electrode positions based on the 10–20 system using modified combinatorial nomenclature.
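The CHB–MIT recordings are distributed as EDF files, one file per continuous recording. The paper does not describe its data-loading code; as a hedged illustration only, a recording could be read into a (samples x channels) array with the MNE library, where the file name below is just an example from the public dataset layout:

    import mne  # assumed available; MNE can parse EDF recordings

    # Example CHB-MIT file name; adjust the path to the downloaded dataset.
    raw = mne.io.read_raw_edf("chb01/chb01_03.edf", preload=True, verbose=False)
    data = raw.get_data().T          # shape: (n_samples, n_channels)
    fs = int(raw.info["sfreq"])      # 256 Hz for CHB-MIT
    print(data.shape, fs)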
In this study, 16 of the 23 pediatric patients are selected for the assessment of the classification models. More details about the selected patients are listed in Table 1. All of Chb16's seizures last less than 10 s, which is too short to be useful, so they were not considered for testing (Gao et al., 2020). The seizures of two patients (Chb12 and Chb13) were omitted due to excessive variations in channel naming and electrode position swapping. Four patients (Chb04, Chb15, Chb18, and Chb19) were excluded since they are 16 years of age or older, because the aim is to study seizure detection in young children.

Table 1. Seizure information of the selected patients.

Typically, epileptic patients have limited numbers of seizures that span much shorter times relative to seizure-free periods, so a discrepancy between the numbers of ictal and interictal EEG data segments is often present. To overcome the resulting bias in the training process of the classification models, in which classifiers tend to favor the class with the largest number of segments, and as a design choice, the number of interictal segments is chosen to be equal to the number of ictal segments while forming the final dataset. Downsampling the original interictal data in this way can be found in previous work, as in Wei et al. (2019) and Gao et al. (2020). Non-overlapping EEG segments of 1, 2, and 4 s duration were tested for evaluating the proposed models. A single EEG segment is represented as a matrix of dimension (L x N), where L is the sequence length = 256 x segment duration and N is the number of channels. As an example, one 2-s segment is represented as a 512 x 23 matrix. The EEG dataset is then formed by stacking all the ictal and interictal segments in one matrix of dimension (2KL x 23), where K is the number of ictal (or interictal) segments and L is as defined before. The segmentation step is sketched below.
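To make the segment bookkeeping concrete, here is a minimal NumPy sketch (with a hypothetical helper name, segment_eeg) of cutting a continuous (n_samples x 23) recording into non-overlapping (L x 23) windows, with L = 256 x the segment duration in seconds:

    import numpy as np

    def segment_eeg(recording, fs=256, seconds=2):
        """Cut a (n_samples, n_channels) recording into non-overlapping
        (L, n_channels) segments, where L = fs * seconds."""
        L = fs * seconds
        n_segments = recording.shape[0] // L          # drop the ragged tail
        trimmed = recording[:n_segments * L]
        return trimmed.reshape(n_segments, L, recording.shape[1])

    # Example: 60 s of 23-channel EEG -> 30 segments of shape (512, 23).
    eeg = np.random.randn(60 * 256, 23)
    segments = segment_eeg(eeg, seconds=2)
    print(segments.shape)  # (30, 512, 23)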
Dataset Preparation

To prepare the EEG dataset before the training phase, all segments combined are pre-processed by applying z-score normalization to all channels at once, to ensure that all values are standardized to zero mean (μ) and unit standard deviation (σ) using Eq. (1):

    x = \frac{x - \mu}{\sigma}    (1)

Next, as a batch, the whole dataset is scaled to the [0, 1] range using Min–Max normalization, to ensure that the original and the reconstructed segments have the same range of values. Finally, the channel dimension of the segments is extended by one column, from 23 to 24, to make the segment dimensions more suitable for the AE used. A sketch of this preparation pipeline follows.
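A minimal NumPy sketch of this preparation, assuming the segments are stacked in an array of shape (n_segments, L, 23). Note that the paper only states that the channel dimension is extended by one column; zero padding is an assumption made here for illustration:

    import numpy as np

    def prepare_segments(segments):
        """z-score all values at once, Min-Max scale to [0, 1],
        then pad the channel dimension from 23 to 24 columns."""
        x = (segments - segments.mean()) / segments.std()    # Eq. (1), global
        x = (x - x.min()) / (x.max() - x.min())              # Min-Max to [0, 1]
        # Assumed zero padding for the extra column (not specified in the text).
        return np.pad(x, ((0, 0), (0, 0), (0, 1)), mode="constant")

    prepared = prepare_segments(np.random.randn(30, 512, 23))
    print(prepared.shape)  # (30, 512, 24)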
<a href="#F2">Figure 2</a> shows the block diagram of the first proposed model which consists of a 2D-DCAE where the encoder output, the latent space representation, is also fed into an MLP network to perform the classification task.</p> <div class="DottedLine"></div> <div class="Imageheaders">FIGURE 2</div> <div class="FigureDesc"> <a href="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g002.jpg" name="figure2" target="_blank"> <picture> <source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=480&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g002.jpg" media="(max-width: 563px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=370&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g002.jpg" media="(max-width: 1024px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=290&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g002.jpg" media="(max-width: 1441px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=410&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g002.jpg" media=""><source type="image/jpg" srcset="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g002.jpg" media=""> <img src="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g002.jpg" alt="www.frontiersin.org" id="F2" loading="lazy"> </picture> </a> <p><strong>Figure 2.</strong> Block diagram of 2D-DCAE + MLP model for seizure detection.</p> </div> <div class="clear"></div> <div class="DottedLine"></div> <p class="mb15 w100pc float_left mt15"><a href="#F3">Figure 3</a> shows the block diagram of the second proposed model which consists of a 2D-DCAE but in this case, the encoder output which is the latent space representation is feed into a Bi-LSTM recurrent neural network to perform the classification task.</p> <div class="DottedLine"></div> <div class="Imageheaders">FIGURE 3</div> <div class="FigureDesc"> <a href="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g003.jpg" name="figure3" target="_blank"> <picture> <source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=480&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g003.jpg" media="(max-width: 563px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=370&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g003.jpg" media="(max-width: 1024px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=290&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g003.jpg" media="(max-width: 1441px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=410&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g003.jpg" media=""><source type="image/jpg" srcset="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g003.jpg" media=""> <img 
src="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g003.jpg" alt="www.frontiersin.org" id="F3" loading="lazy"> </picture> </a> <p><strong>Figure 3.</strong> Block diagram of 2D-DCAE + Bi-LSTM model for seizure detection.</p> </div> <div class="clear"></div> <div class="DottedLine"></div> <p class="mb15 w100pc float_left mt15">The performance of the two proposed models will be compared with two other models. One of the new models comprises a two-dimensional deep convolutional neural network (2D-DCNN) connected to an MLP, <a href="#F4">Figure 4A</a>, while a 2D-DCNN is connected to a Bi-LSTM to form the second model, <a href="#F4">Figure 4B</a>.</p> <div class="DottedLine"></div> <div class="Imageheaders">FIGURE 4</div> <div class="FigureDesc"> <a href="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g004.jpg" name="figure4" target="_blank"> <picture> <source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=480&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g004.jpg" media="(max-width: 563px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=370&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g004.jpg" media="(max-width: 1024px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=290&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g004.jpg" media="(max-width: 1441px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=410&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g004.jpg" media=""><source type="image/jpg" srcset="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g004.jpg" media=""> <img src="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g004.jpg" alt="www.frontiersin.org" id="F4" loading="lazy"> </picture> </a> <p><strong>Figure 4.</strong> Block diagram of the two-dimensional deep convolutional neural network-based models for seizure detection, <b>(A)</b> 2D-DCNN + MLP model and <b>(B)</b> 2D-DCNN + Bi-LSTM model.</p> </div> <div class="clear"></div> <div class="DottedLine"></div> <h3>Two-Dimensional Deep Convolutional Autoencoder</h3> <p class="mb15">Convolutional neural networks are a special class of feedforward neural networks that are very well-suited for processing multidimensional data like images or multi-channel EEG signals. Applications of CNNs in a variety of disciplines, such as computer vision and pattern recognition, have recorded very impressive outcomes (<a href="#B22">Krizhevsky et al., 2017</a>). This is due to its great ability to hierarchically learn excellent spatial features for the representation of data of different types. The parameter sharing and sparse connections properties of CNNs make them much more memory-savers compared to MLPs networks that consist of fully connected layers. 
As a result of these advantages, a two-dimensional convolutional autoencoder stacked with convolution and pooling layers is proposed in this work rather than a standard AE that uses only fully connected layers.

The encoder subnetwork of the AE is a CNN consisting of four convolutional layers and four max-pooling layers stacked interchangeably. The convolutional layers are responsible for learning the spatial and temporal features in the input EEG signal segments, while the max-pooling layers are used for dimensionality reduction by downsampling. A single convolutional layer is made up of filters (kernels) consisting of trainable parameters (weights) that slide over and convolve with the input to generate feature maps, where the number of feature maps equals the number of applied filters. A configurable parameter (the stride) controls how far the filter window slides over the input. The pooling layer performs downsampling by lowering the dimension of the feature maps to reduce computational complexity. The low-dimensional output of the encoder network is called the latent space representation, or bottleneck. On the other side, the decoder subnetwork consists of four convolutional layers and four upsampling layers, which are also deployed interchangeably and are used to reconstruct the original input.

In all models, the convolutional layers of the encoder network are configured with 32, 32, 64, and 64 filters, respectively. In the decoder network, the first three convolutional layers are configured with 64, 32, and 32 filters, while the last layer has only one filter. All convolutional layers have a kernel size of 3 x 2 and a default stride of one. To keep the height and width of the feature maps unchanged, all convolutional layers use "same" padding. The activation function used in all convolutional layers, except the last layer, is the rectified linear unit (ReLU) defined in Eq. (2), chosen for its sparsity, computational simplicity, and robustness against noise in the input signals (Goodfellow et al., 2017):

    f(x) = \max\{0, x\}    (2)

where x is the weighted sum of the inputs and f(x) is the ReLU activation function.

The final convolutional layer of the 2D-DCAE uses the sigmoid activation function defined in Eq. (3) to generate an output in the range [0, 1]:

    y = \frac{1}{1 + e^{-x}}    (3)

where x is the weighted sum of the inputs and y is the output of the activation function.
All max-pooling layers are configured to downsample their input by taking the maximum value over windows of size (2, 2), except the last layer, which uses a window of size (2, 3). The first upsampling layer interpolates the rows and columns of its input using a size of (2, 3), while the last three upsampling layers use sizes of (2, 2).

Our models apply the batch normalization (batch norm) technique to speed up and stabilize the training process and to ensure high performance. The batch norm transform (Ioffe and Szegedy, 2015) is defined as:

    BN_{\gamma,\beta}(x_i) = \gamma \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}} + \beta    (4)

where an input vector x_i is normalized within a mini-batch B = {x_1, x_2, ..., x_m} having mean \mu_B and variance \sigma_B^2. \beta and \gamma are two parameters that are learned jointly and used to scale and shift the normalized value, while \epsilon is a constant added for numerical stability. Four batch normalization layers are deployed between the four convolutional and max-pooling layers of the encoder subnetwork. A sketch of the resulting encoder and decoder stack is given below.
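Putting the stated layer configuration together, the following Keras sketch builds the described encoder and decoder for a 2-s segment, assuming a (512, 24, 1) input. The framework and the exact ordering of the upsampling and convolution operations in the decoder are not specified in the text, so this is an illustration under those assumptions:

    from tensorflow.keras import layers, Model

    def build_dcae(input_shape=(512, 24, 1)):
        """Sketch of the described 2D-DCAE for 2-s segments (512 x 24 x 1)."""
        inp = layers.Input(shape=input_shape)

        # Encoder: 4 x (conv + batch norm + max-pool); last pool uses (2, 3).
        x = inp
        pools = [(2, 2), (2, 2), (2, 2), (2, 3)]
        for f, pool in zip([32, 32, 64, 64], pools):
            x = layers.Conv2D(f, (3, 2), padding="same", activation="relu")(x)
            x = layers.BatchNormalization()(x)
            x = layers.MaxPooling2D(pool)(x)
        latent = x                                  # bottleneck: (32, 1, 64)

        # Decoder: 4 x (upsample + conv); first upsampling uses (2, 3).
        ups = [(2, 3), (2, 2), (2, 2), (2, 2)]
        for up, f in zip(ups, [64, 32, 32, 1]):
            x = layers.UpSampling2D(up)(x)
            act = "sigmoid" if f == 1 else "relu"   # sigmoid on the last layer
            x = layers.Conv2D(f, (3, 2), padding="same", activation=act)(x)

        return Model(inp, x), latent

    model, latent = build_dcae()
    model.summary()  # final output shape matches the (512, 24, 1) input

With the 24-column padded width, the pooling sizes divide evenly (24 -> 12 -> 6 -> 3 -> 1), which is consistent with the one-column channel extension described in the Dataset Preparation subsection.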
Proposed Classification Models

Two-Dimensional Deep Convolution Autoencoder + MLP

In the first proposed model, depicted in Figure 5, the output of the encoder subnetwork (the latent space representation) is also converted from its multi-dimensional form to a vector using a flatten layer and then fed into an MLP network-based classifier. The MLP network consists of two hidden fully connected layers having 50 and 32 neurons (units), respectively. Both layers use the ReLU activation function. The output layer of the MLP has a sigmoid activation function whose output represents the probability that an input EEG segment belongs to one of the two classes.

Figure 5. Proposed 2D-DCAE + MLP architecture (assuming an EEG segment length of 2 s).
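Continuing the earlier sketch, the MLP head described above could be attached to the bottleneck as follows. This is again a hedged Keras illustration, where `latent` and `model` are the bottleneck tensor and autoencoder from the previous sketch:

    from tensorflow.keras import layers, Model

    # Classifier head on the flattened latent representation.
    z = layers.Flatten()(latent)                     # (32, 1, 64) -> 2048 values
    z = layers.Dense(50, activation="relu")(z)       # hidden layer 1
    z = layers.Dense(32, activation="relu")(z)       # hidden layer 2
    prob = layers.Dense(1, activation="sigmoid")(z)  # P(segment is ictal)

    # Joint model: reconstruction output + classification output,
    # trained once in a supervised way (the SDCAE approach).
    sdcae_mlp = Model(model.input, [model.output, prob])
    sdcae_mlp.compile(optimizer="adam",
                      loss=["mse", "binary_crossentropy"])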
<a href="#F6">Figure 6</a> shows the structure of a single LSTM cell.</p> <div class="DottedLine"></div> <div class="Imageheaders">FIGURE 6</div> <div class="FigureDesc"> <a href="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g006.jpg" name="figure6" target="_blank"> <picture> <source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=480&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g006.jpg" media="(max-width: 563px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=370&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g006.jpg" media="(max-width: 1024px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=290&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g006.jpg" media="(max-width: 1441px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=410&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g006.jpg" media=""><source type="image/jpg" srcset="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g006.jpg" media=""> <img src="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g006.jpg" alt="www.frontiersin.org" id="F6" loading="lazy"> </picture> </a> <p><strong>Figure 6.</strong> Long short-term memory cell structure.</p> </div> <div class="clear"></div> <div class="DottedLine"></div> <p class="mb15 w100pc float_left mt15">The following equations show how information is processed inside the LSTM cell.</p> <div class="equationImageholder pb0"> <math display='block'> <mrow> <msub> <mi>f</mi> <mi>t</mi> </msub> <mo>=</mo> <mi mathvariant="normal">&#x03C3;</mi> <mrow> <mo stretchy="false">(</mo> <msub> <mi>W</mi> <mi>f</mi> </msub> <mo>.</mo> <mrow> <mo>[</mo> <msub> <mi>h</mi> <mrow> <mi>t</mi> <mo>-</mo> <mn>1</mn> </mrow> </msub> <mo rspace="12.5pt">,</mo> <msub> <mi>x</mi> <mi>t</mi> </msub> <mo>]</mo> </mrow> <mo>+</mo> <msub> <mi>b</mi> <mi>f</mi> </msub> <mo rspace="7.5pt" stretchy="false">)</mo> </mrow> </mrow> <mspace width="5em"></mspace><mo stretchy='false'>(</mo><mn>5</mn><mo stretchy='false'>)</mo></math> </div> <div class="equationImageholder pb0"> <math display='block'> <mrow> <msub> <mi>i</mi> <mi>t</mi> </msub> <mo>=</mo> <mi mathvariant="normal">&#x03C3;</mi> <mrow> <mo stretchy="false">(</mo> <msub> <mi>W</mi> <mi>i</mi> </msub> <mo>.</mo> <mrow> <mo>[</mo> <msub> <mi>h</mi> <mrow> <mi>t</mi> <mo>-</mo> <mn>1</mn> </mrow> </msub> <mo rspace="7.5pt">,</mo> <msub> <mi>x</mi> <mi>t</mi> </msub> <mo>]</mo> </mrow> <mo>+</mo> <msub> <mi>b</mi> <mi>i</mi> </msub> <mo stretchy="false">)</mo> </mrow> </mrow> <mspace width="5em"></mspace><mo stretchy='false'>(</mo><mn>6</mn><mo stretchy='false'>)</mo></math> </div> <div class="equationImageholder pb0"> <math display='block'> <mrow> <msub> <mi>o</mi> <mi>t</mi> </msub> <mo>=</mo> <mi mathvariant="normal">&#x03C3;</mi> <mrow> <mo>(</mo> <msub> <mi>W</mi> <mi>o</mi> </msub> <mo>.</mo> <mrow> <mo>[</mo> <msub> <mi>h</mi> <mrow> <mi>t</mi> <mo>-</mo> <mn>1</mn> </mrow> </msub> <mo rspace="7.5pt">,</mo> <msub> <mi>x</mi> <mi>t</mi> </msub> <mo>]</mo> </mrow> <mo>+</mo> <msub> <mi>b</mi> <mi>o</mi> </msub> <mo>)</mo> </mrow> </mrow> <mspace width="5em"></mspace><mo 
$$o_t = \sigma(W_o \cdot [h_{t-1}, x_t] + b_o) \quad (7)$$

$$\tilde{c}_t = \tanh(W_c \cdot [h_{t-1}, x_t] + b_c) \quad (8)$$

$$c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \quad (9)$$

$$h_t = o_t \odot \tanh(c_t) \quad (10)$$

where x_t is the input at time t in a sequence X = (x_1, x_2, x_3, …, x_n) of n time steps; h_{t-1} and c_{t-1} are the hidden state output and cell state at the previous time step, respectively; h_t and c_t are the current hidden state and cell state; and f_t, i_t, and o_t are the forget, input, and output gates. W and b denote the weight matrices and bias vectors, σ is the sigmoid (logistic) function, and ⊙ is the Hadamard (elementwise) product operator. The memory cell starts by selecting, through the forget gate f_t, which information to keep or discard from the previous state. The cell then calculates the candidate state \tilde{c}_t. After that, using the prior cell state c_{t-1} and the input gate i_t, the cell decides what further information to write to the current state c_t. Finally, the output gate o_t determines how much of the state information h_t is passed on to the next time step.
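To make Eqs. (5)–(10) concrete, the following minimal NumPy sketch performs one forward step of an LSTM cell. It is an illustration only, not the authors' implementation; the dictionary layout of the parameters and the toy dimensions are our assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM cell step implementing Eqs. (5)-(10).

    W holds one (d_h x (d_h + d_x)) weight matrix and b one (d_h,) bias
    vector for each of the 'f', 'i', 'o', and 'c' transforms.
    """
    z = np.concatenate([h_prev, x_t])        # [h_{t-1}, x_t]
    f_t = sigmoid(W['f'] @ z + b['f'])       # forget gate, Eq. (5)
    i_t = sigmoid(W['i'] @ z + b['i'])       # input gate, Eq. (6)
    o_t = sigmoid(W['o'] @ z + b['o'])       # output gate, Eq. (7)
    c_tilde = np.tanh(W['c'] @ z + b['c'])   # candidate state, Eq. (8)
    c_t = f_t * c_prev + i_t * c_tilde       # cell state, Eq. (9) (* = Hadamard)
    h_t = o_t * np.tanh(c_t)                 # hidden state, Eq. (10)
    return h_t, c_t

# Example: one step with random parameters (shapes are illustrative).
rng = np.random.default_rng(0)
d_x, d_h = 4, 3
W = {g: rng.standard_normal((d_h, d_h + d_x)) for g in 'fioc'}
b = {g: np.zeros(d_h) for g in 'fioc'}
h, c = lstm_step(rng.standard_normal(d_x), np.zeros(d_h), np.zeros(d_h), W, b)
```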
Note that, to keep Figure 6 simple, the biases and the multiplications between the concatenated input/hidden-state matrix and the weight matrices are not shown.

In the second proposed model, the output of the encoder subnetwork is fed into a Bi-LSTM recurrent neural network-based classifier, as shown in Figure 7.

Figure 7. Proposed 2D-DCAE + Bi-LSTM architecture (assuming an EEG segment length of 2 s).

For classification, this model uses a single-layer Bi-LSTM network consisting of two LSTM blocks (cells). The Bi-LSTM architecture is similar to the standard unidirectional LSTM architecture, except that both LSTM blocks process the output of the encoder, reshaped as a sequence, simultaneously in two opposite directions rather than in one direction. After the entire input sequence has been processed, the two blocks' outputs are averaged, and the result is used for the classification task. Bi-LSTMs take into account the temporal dependence between the input at a certain time and both its previous and subsequent counterparts, which is a strong advantage for improving classification results (Abdelhameed et al., 2018b).
<a href="#F8">Figure 8</a> shows a single-layer Bi-LSTM network unrolled over n time steps.</p> <div class="DottedLine"></div> <div class="Imageheaders">FIGURE 8</div> <div class="FigureDesc"> <a href="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g008.jpg" name="figure8" target="_blank"> <picture> <source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=480&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g008.jpg" media="(max-width: 563px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=370&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g008.jpg" media="(max-width: 1024px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=290&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g008.jpg" media="(max-width: 1441px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=410&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g008.jpg" media=""><source type="image/jpg" srcset="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g008.jpg" media=""> <img src="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g008.jpg" alt="www.frontiersin.org" id="F8" loading="lazy"> </picture> </a> <p><strong>Figure 8.</strong> Unrolled single-layer Bi-LSTM network.</p> </div> <div class="clear"></div> <div class="DottedLine"></div> <p class="mb15 w100pc float_left mt15">The Bi-LSTM layer is configured to have 50 units and to overcome overfitting, the dropout regularization technique is used with a value of 0.1. As in the first model, the sigmoid activation function is used to predict the EEG segment class label.</p> <h4>Loss Functions and Optimizer</h4> <p class="mb15">As the SDCAE is performing the two tasks of input reconstruction and classification simultaneously, both proposed models are designed to minimize two losses during network training. The first loss is the supervised classification loss (CL) between the predicted and actual class labels. The binary cross-entropy, defined in Eq. 
Loss Functions and Optimizer

As the SDCAE performs the two tasks of input reconstruction and classification simultaneously, both proposed models are designed to minimize two losses during network training. The first loss is the supervised classification loss (CL) between the predicted and actual class labels; the binary cross-entropy, defined in Eq. (11), is chosen as its loss function.

$$CL = -\frac{1}{N}\sum_{i=0}^{N-1}\left[y_i \cdot \log(\hat{y}_i) + (1 - y_i) \cdot \log(1 - \hat{y}_i)\right] \quad (11)$$

where \hat{y}_i is the predicted model output for a single EEG segment and y_i is the corresponding actual class label, in a training batch of size N.

The second loss is the reconstruction loss (RL) between the input EEG segments and their reconstructed equivalents decoded by the DCAE; the mean squared error, defined in Eq. (12), is used to calculate this loss.
$$RL = \frac{1}{N}\sum_{i=0}^{N-1}\frac{1}{mn}\sum_{j=0}^{m-1}\sum_{k=0}^{n-1}\left(y_{jk} - \hat{y}_{jk}\right)^2 \quad (12)$$

where y_{jk} is the original value at position (j, k) in an input EEG segment matrix of size (m × n), \hat{y}_{jk} is the corresponding reconstructed value, and N is the number of segments, defined as before.

There is not much difference between training a deep-learning model with a single output and one with multiple outputs. In the latter case, as in our proposed SDCAE models, the total loss (TL) of a model is calculated as the weighted sum of the CL and the RL, as in Eq. (13):

$$TL = w_c \times CL + w_r \times RL \quad (13)$$

where w_c and w_r are the loss weights and can take any value in the interval (0, 1]. In our design, w_c is chosen to be 0.5 while w_r equals 1.
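In Keras, this weighted two-loss setup can be declared at compile time. The sketch below is illustrative only: the tiny stand-in network, the output-layer names ('clf', 'rec'), and the input shape are our assumptions, while the two losses, the weights w_c = 0.5 and w_r = 1, and the Adam optimizer with a learning rate of 0.0001 follow the choices reported in the text.

```python
from tensorflow.keras import Model, layers
from tensorflow.keras.optimizers import Adam

# Stand-in network; a real SDCAE would use the 2D convolutional
# encoder/decoder described above. Shape assumes a 2-s segment at
# 256 Hz with 23 channels (an illustrative assumption).
inp = layers.Input(shape=(512, 23, 1), name='eeg')
x = layers.Conv2D(8, 3, padding='same', activation='relu')(inp)  # toy "encoder"
rec = layers.Conv2D(1, 3, padding='same', name='rec')(x)         # reconstruction head
pooled = layers.GlobalAveragePooling2D()(x)
clf = layers.Dense(1, activation='sigmoid', name='clf')(pooled)  # classifier head

model = Model(inp, [clf, rec])
model.compile(
    optimizer=Adam(learning_rate=1e-4),
    loss={'clf': 'binary_crossentropy',   # CL, Eq. (11)
          'rec': 'mse'},                  # RL, Eq. (12)
    loss_weights={'clf': 0.5, 'rec': 1.0},  # TL = 0.5*CL + 1*RL, Eq. (13)
)
```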
The backpropagation of the loss in both subnetworks starts by calculating two partial derivatives (gradients): ∂TL/∂CL and ∂TL/∂RL. All other gradients are then calculated using the chain rule, and the weights and biases are updated in the same way as in typical deep-learning models.

Different optimizers such as stochastic gradient descent (SGD; Bottou, 2004), root mean square propagation (RMSProp; Tieleman and Hinton, 2012), ADADELTA (Zeiler, 2012), and Adam (Kingma and Ba, 2014) were tested while training the SDCAE. Based on the models' performance, Adam was ultimately chosen, with the learning rate set to 0.0001.

Data Selection and Training

The performance of the two proposed SDCAE seizure detection models (DCAE + MLP and DCAE + Bi-LSTM) is evaluated against that of two regular deep-learning models (DCNN + MLP and DCNN + Bi-LSTM) using EEG segments of three different lengths. In total, therefore, twelve model configurations (four architectures × three segment lengths) are tested and assessed using various performance measures.

A stratified 10-fold cross-validation methodology (He and Ma, 2013) is used to prepare the dataset for training and to evaluate the performance of all models on unseen data, testing their strength and reliability. The investigated EEG dataset (containing both interictal and ictal data segments) is randomly divided into ten equal subsamples (folds) such that the balanced distribution of the two classes (ictal and interictal) is preserved within each fold. One fold (10% of the dataset) is held out as the testing set, while the remaining nine folds together form the training set. The cross-validation process is repeated for ten iterations, with each of the 10 folds used exactly once as the testing set. Within each iteration, all models are trained for 200 epochs with a batch size of 32. The mean and standard deviation of the classification results over the 10 iterations are computed to produce the final estimates of the different evaluation metrics.
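As a minimal sketch, the fold preparation described above can be reproduced with scikit-learn's StratifiedKFold; the placeholder arrays below merely stand in for the real EEG segments and their ictal (1) / interictal (0) labels.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

# Placeholder dataset: random arrays stand in for EEG segments; labels
# are balanced between ictal (1) and interictal (0) for illustration.
rng = np.random.default_rng(0)
segments = rng.random((100, 512, 23, 1), dtype=np.float32)
labels = np.tile([0, 1], 50)

skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for fold, (train_idx, test_idx) in enumerate(skf.split(segments, labels), 1):
    # Each fold preserves the ictal/interictal balance, and each of the
    # 10 folds serves exactly once as the testing set. Training each
    # model (200 epochs, batch size 32) would happen here.
    print(f"fold {fold}: train={len(train_idx)}, test={len(test_idx)}")
```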
Models Performance Evaluation

Various statistical metrics commonly used in the literature, such as accuracy, sensitivity (recall), specificity, precision, and F1-score (Sokolova and Lapalme, 2009), are calculated to assess the classification performance of the models on the testing set in each of the ten iterations of the 10-fold cross-validation. These evaluation metrics are defined as follows:

$$Accuracy = \frac{TP + TN}{TP + TN + FP + FN} \times 100\% \quad (14)$$

$$Sensitivity\ (Recall) = \frac{TP}{TP + FN} \times 100\% \quad (15)$$

$$Specificity = \frac{TN}{TN + FP} \times 100\% \quad (16)$$

$$Precision = \frac{TP}{TP + FP} \times 100\% \quad (17)$$

$$F1\text{-}score = 2 \times \frac{Precision \times Recall}{Precision + Recall} \times 100\% \quad (18)$$

where P denotes the number of positive (ictal) EEG segments and N the number of negative (interictal) EEG segments. TP and TN are the numbers of true positives and true negatives, while FP and FN are the numbers of false positives and false negatives, respectively. In this study, accuracy is the percentage of correctly classified EEG segments of either state (ictal or interictal), sensitivity is the percentage of correctly classified ictal-state EEG segments, specificity is the percentage of correctly classified interictal-state EEG segments, and precision measures how many of the EEG segments classified as ictal are originally ictal-state segments. Finally, the F1-score combines precision and recall in a single metric.
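Given the confusion-matrix counts, Eqs. (14)–(18) translate directly into code, as in this small illustrative helper:

```python
def evaluation_metrics(tp, tn, fp, fn):
    """Compute Eqs. (14)-(18), as percentages, from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + tn + fp + fn) * 100   # Eq. (14)
    sensitivity = tp / (tp + fn) * 100                 # Eq. (15), recall
    specificity = tn / (tn + fp) * 100                 # Eq. (16)
    precision = tp / (tp + fp) * 100                   # Eq. (17)
    # Eq. (18); the percentage scale of precision/recall cancels out.
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, sensitivity, specificity, precision, f1
```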
width="5em"></mspace><mo stretchy='false'>(</mo><mn>17</mn><mo stretchy='false'>)</mo></math> </div> <div class="equationImageholder pb0"> <math display='block'> <mrow> <mrow> <mrow> <mi>F</mi> <mo>&#x2062;</mo> <mn>1</mn> </mrow> <mo>-</mo> <mrow> <mi>s</mi> <mo>&#x2062;</mo> <mi>c</mi> <mo>&#x2062;</mo> <mi>o</mi> <mo>&#x2062;</mo> <mi>r</mi> <mo>&#x2062;</mo> <mi>e</mi> </mrow> </mrow> <mo>=</mo> <mrow> <mn>2</mn> <mo>&#x00D7;</mo> <mfrac> <mrow> <mrow> <mrow> <mi>P</mi> <mo>&#x2062;</mo> <mi>r</mi> <mo>&#x2062;</mo> <mi>e</mi> <mo>&#x2062;</mo> <mi>c</mi> <mo>&#x2062;</mo> <mi>i</mi> <mo>&#x2062;</mo> <mi>s</mi> <mo>&#x2062;</mo> <mi>i</mi> <mo>&#x2062;</mo> <mi>o</mi> <mo>&#x2062;</mo> <mi>n</mi> </mrow> <mo>&#x00D7;</mo> <mi>R</mi> </mrow> <mo>&#x2062;</mo> <mi>e</mi> <mo>&#x2062;</mo> <mi>c</mi> <mo>&#x2062;</mo> <mi>a</mi> <mo>&#x2062;</mo> <mi>l</mi> <mo>&#x2062;</mo> <mi>l</mi> </mrow> <mrow> <mrow> <mi>P</mi> <mo>&#x2062;</mo> <mi>r</mi> <mo>&#x2062;</mo> <mi>e</mi> <mo>&#x2062;</mo> <mi>c</mi> <mo>&#x2062;</mo> <mi>i</mi> <mo>&#x2062;</mo> <mi>s</mi> <mo>&#x2062;</mo> <mi>i</mi> <mo>&#x2062;</mo> <mi>o</mi> <mo>&#x2062;</mo> <mi>n</mi> </mrow> <mo>+</mo> <mrow> <mi>R</mi> <mo>&#x2062;</mo> <mi>e</mi> <mo>&#x2062;</mo> <mi>c</mi> <mo>&#x2062;</mo> <mi>a</mi> <mo>&#x2062;</mo> <mi>l</mi> <mo>&#x2062;</mo> <mi>l</mi> </mrow> </mrow> </mfrac> <mo>&#x00D7;</mo> <mrow> <mn>100</mn> <mo lspace="0pt" rspace="3.5pt">%</mo> </mrow> </mrow> </mrow> <mspace width="5em"></mspace><mo stretchy='false'>(</mo><mn>18</mn><mo stretchy='false'>)</mo></math> </div> <p class="mb0">where <i>P</i> denotes the number of positive (ictal) EEG segments while <i>N</i> denotes the number of negative (interictal) EEG segments). <i>TP</i> and <i>TN</i> are the numbers of true positives and true negatives while <i>FP</i> and <i>FN</i> are the numbers of false positives and false negatives, respectively. In this study, accuracy is defined as the percentage of the correctly classified EEG segments belonging to any state (ictal or interictal), sensitivity is the percentage of correctly classified ictal state EEG segments, specificity is the percentage of correctly classified interictal state EEG segments, while precision determines how many of the EEG segments classified as belonging to the ictal state are originally ictal state EEG segments. Finally, the F1-score combines the values of precision and recall in a single metric.</p> <h3>Models Implementation</h3> <p class="mb0">The Python programming language along with many supporting libraries and in particular, the Tensorflow machine learning library&#x2019;s Keras deep learning API, has been used to develop our models. Due to the variations in the hardware resources and different GPU specifications, we have chosen not to include the computational times of training and testing the proposed models as a metric in our comparisons especially since we are developing our models using external resources provided by Google Colaboratory online environment that runs on Google&#x2019;s cloud servers.</p> <a id="h4" name="h4"></a><h2>Results</h2> <p class="mb0">For each of the four models, <a href="#F9">Figure 9</a> shows the ranges of values of the five performance metrics calculated based on the 10-Fold cross-validation classification results of the EEG segments of lengths 1, 2, and 4 s. 
The mean and standard deviation of all metrics over the 10 folds are then calculated and summarized in Table 2.

Figure 9. Boxplots showing the ranges of the performance-metric percentages calculated from the 10-fold cross-validation results.

Table 2. Classification results using different EEG segment lengths.

The same results are visualized in Figure 10.
class="DottedLine"></div> <div class="Imageheaders">FIGURE 10</div> <div class="FigureDesc"> <a href="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g010.jpg" name="figure10" target="_blank"> <picture> <source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=480&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g010.jpg" media="(max-width: 563px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=370&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g010.jpg" media="(max-width: 1024px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=290&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g010.jpg" media="(max-width: 1441px)"><source type="image/webp" srcset="https://images-provider.frontiersin.org/api/ipx/w=410&f=webp/https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g010.jpg" media=""><source type="image/jpg" srcset="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g010.jpg" media=""> <img src="https://www.frontiersin.org/files/Articles/650050/fncom-15-650050-HTML/image_m/fncom-15-650050-g010.jpg" alt="www.frontiersin.org" id="F10" loading="lazy"> </picture> </a> <p><strong>Figure 10.</strong> Visualization of the classification results of the models using different EEG segment lengths.</p> </div> <div class="clear"></div> <div class="DottedLine"></div> <a id="h5" name="h5"></a><h2>Discussion</h2> <p class="mb15">As can be seen from the results, for all EEG segment lengths and evaluation metrics, the two proposed SDCAE models (DCAE + MLP and DCAE + Bi-LSTM) have outperformed the other two models (DCNN + MLP and DCNN + Bi-LSTM) that do not use AEs. Furthermore, as highlighted in <a href="#T2">Table 2</a>, using a segment length of 4 s, the DCAE + Bi-LSTM model has achieved the highest performance in terms of all evaluation metrics among all other combinations of models. It is also interesting to see that in all SDCAE models, a 4-s EEG segment length is the best choice to get the best classification performance. Generally, it can be noticed that all models that utilized a Bi-LSTM for classification have accomplished better results compared to their counterpart models that use MLP-based classifiers using the same EEG segment lengths. That can be explained as Bi-LSTM networks are more capable to learn better temporal patterns from the generated latent space sequence better than MLP networks. 
Finally, comparing the standard deviations of the evaluation metrics across all models shows that the results of the SDCAE models mostly have less dispersion than those of the other models, meaning that the SDCAE models' performance is more consistent across the cross-validation iterations.

Figure 11 shows the classification accuracy, CL, and RL curves for the training and testing datasets obtained while training the winning model (DCAE + Bi-LSTM) in one of the iterations of the 10-fold cross-validation.

Figure 11. Accuracy and loss curves against the number of epochs obtained while training the DCAE + Bi-LSTM model.

Statistical Analysis

The non-parametric Kruskal–Wallis H test (Kruskal and Wallis, 1952) is used to test the statistical significance of the classification results of the two proposed models (DCAE + MLP and DCAE + Bi-LSTM). For simplicity, only the test results comparing the evaluation metrics of the models obtained with an EEG segment length of 4 s are reported. When comparing DCAE + Bi-LSTM with the two models (DCNN + MLP and DCNN + Bi-LSTM), the Kruskal–Wallis H test produced p = 0.0005 for accuracy, p = 0.02 for sensitivity, p = 0.025 for specificity, p = 0.005 for precision, and p = 0.001 for F1-score. Likewise, when comparing DCAE + MLP with the same two models, the test gave p = 0.003 for accuracy, p = 0.011 for sensitivity, p = 0.083 for specificity, p = 0.019 for precision, and p = 0.002 for F1-score. Across all performance metrics, nearly all comparisons yielded a p-value below 0.05, apart from a single p-value for specificity, indicating that the differences between the proposed models' results and those of the baseline models are statistically significant.
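Such a test can be run with scipy.stats.kruskal. The sketch below is purely illustrative: the per-fold accuracy lists are made-up placeholder values, not the study's actual results.

```python
from scipy.stats import kruskal

# Hypothetical per-fold accuracies (%); the real analysis compares the
# 10-fold results of a proposed model against the two baselines for
# each evaluation metric in turn.
acc_dcae_bilstm = [98.8, 98.5, 99.1, 98.7, 99.0, 98.6, 98.9, 98.8, 99.2, 98.7]
acc_dcnn_mlp    = [97.1, 96.8, 97.4, 96.9, 97.0, 97.2, 96.7, 97.3, 97.0, 96.9]
acc_dcnn_bilstm = [97.6, 97.3, 97.8, 97.5, 97.4, 97.7, 97.2, 97.6, 97.5, 97.4]

h_stat, p_value = kruskal(acc_dcae_bilstm, acc_dcnn_mlp, acc_dcnn_bilstm)
print(f"H = {h_stat:.3f}, p = {p_value:.4f}")  # p < 0.05 => significant difference
```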
Comparison With Other Methods

In the literature, not all previous work uses the same set of metrics for evaluating seizure classification algorithms, so the comparisons in this section are based only on the most commonly used metrics: accuracy, sensitivity, and specificity. Table 3 summarizes the comparison between our best performing model and some state-of-the-art methods that use deep neural networks for feature extraction and seizure classification. In Yuan et al. (2017), various stacked sparse denoising autoencoders (SSDAE) were tested and compared for feature extraction and classification after preprocessing with the short-time Fourier transform (STFT); the best accuracy obtained was 93.82%, using a random split of training and testing data. Ke et al. (2018) combined the global maximal information coefficient (MIC) with the visual geometry group network (VGGNet) for feature extraction and classification; using fivefold cross-validation, they achieved 98.1% accuracy, 98.85% sensitivity, and 97.47% specificity. Using the fast Fourier transform (FFT) for frequency-domain analysis together with a CNN, Zhou et al. (2018) performed patient-specific classification between ictal and interictal signals; with sixfold cross-validation, the averages of the evaluation metrics over all patients were 97.5% accuracy, 96.86% sensitivity, and 98.15% specificity. Finally, Hossain et al. (2019) used a 2D-CNN model to extract spectral and temporal characteristics of EEG signals for patient-specific classification with a random split of training and testing data, obtaining 98.05% accuracy, 90% sensitivity, and 91.65% specificity for the cross-patient results.
In light of this comparison, the results obtained by our model prove superior to those of these state-of-the-art systems, all of which, moreover, lack a proper statistical analysis for significance testing.

Table 3. Comparison between our best performing model and previous methods using the same dataset.

Conclusion

A novel deep-learning approach for the detection of seizures in pediatric patients is proposed. The approach uses a 2D-SDCAE to detect epileptic seizures by classifying minimally pre-processed raw multichannel EEG signal recordings. In this approach, an AE is trained in a supervised way to classify between ictal and interictal brain-state EEG signals, exploiting its ability to perform both automatic feature learning and classification simultaneously and with high efficiency. Two SDCAE models, using Bi-LSTM- and MLP-based classifiers, were designed and tested using three EEG data segment lengths. The performance of both proposed models was compared to that of two regular deep-learning models having the same layer structure, except that the decoder network layers are removed entirely. The twelve models were trained and assessed using a 10-fold cross-validation scheme and five evaluation metrics; the best performing model was the SDCAE model using a Bi-LSTM and 4-s EEG segments. This model achieved averages of 98.79% accuracy, 98.72% sensitivity, 98.86% specificity, 98.86% precision, and an F1-score of 98.79%. Comparison of this SDCAE model with other state-of-the-art systems using the same dataset shows that the performance of our proposed model is superior to that of most existing systems.

Data Availability Statement

Publicly available datasets were analyzed in this study.
This data can be found here: https://physionet.org/content/chbmit/1.0.0/.

Ethics Statement

Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements. Written informed consent from the participants' legal guardian/next of kin was not required to participate in this study in accordance with the national legislation and the institutional requirements.

Author Contributions

AA conceived the presented idea, conducted the analysis, and produced the figures. MB supervised the findings of this work. Both authors discussed the results and contributed to the final manuscript.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Abdelhameed, A. M., and Bayoumi, M. (2018). "Semi-supervised deep learning system for epileptic seizures onset prediction," in Proceedings of the 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando, FL. doi: 10.1109/icmla.2018.00191

Abdelhameed, A. M., and Bayoumi, M. (2019). Semi-supervised EEG signals classification system for epileptic seizure detection. IEEE Signal Process. Lett. 26, 1922–1926. doi: 10.1109/lsp.2019.2953870

Abdelhameed, A. M., Daoud, H. G., and Bayoumi, M. (2018a). "Deep convolutional bidirectional LSTM recurrent neural network for epileptic seizure detection," in Proceedings of the 2018 16th IEEE International New Circuits and Systems Conference (NEWCAS), Montreal, QC. doi: 10.1109/newcas.2018.8585542
Abdelhameed, A. M., Daoud, H. G., and Bayoumi, M. (2018b). "Epileptic seizure detection using deep convolutional autoencoder," in Proceedings of the 2018 IEEE International Workshop on Signal Processing Systems (SiPS), Cape Town. doi: 10.1109/sips.2018.8598447

Acharya, U. R., Oh, S. L., Hagiwara, Y., Tan, J. H., and Adeli, H. (2018). Deep convolutional neural network for the automated detection and diagnosis of seizure using EEG signals. Comput. Biol. Med. 100, 270–278. doi: 10.1016/j.compbiomed.2017.09.017

Acharya, U. R., Sree, S. V., Swapna, G., Martis, R. J., and Suri, J. S. (2013). Automated EEG analysis of epilepsy: a review. Knowled. Based Syst. 45, 147–165. doi: 10.1016/j.knosys.2013.02.014

Bengio, Y., Simard, P., and Frasconi, P. (1994). Learning long-term dependencies with gradient descent is difficult. IEEE Transact. Neur. Netw. 5, 157–166. doi: 10.1109/72.279181
Bottou, L. (2004). Stochastic Learning. Advanced Lectures on Machine Learning, LNAI, Vol. 3176. Berlin: Springer, 146–168.

Chen, G., Xie, W., Bui, T. D., and Krzyżak, A. (2017). Automatic epileptic seizure detection in EEG using nonsubsampled wavelet–fourier features. J. Med. Biol. Eng. 37, 123–131. doi: 10.1007/s40846-016-0214-0

Epilepsy in Children (2020). Diagnosis & Treatment HealthyChildren.org. Available online at: https://www.healthychildren.org/English/health-issues/conditions/seizures/Pages/Epilepsy-in-Children-Diagnosis-and-Treatment.aspx (accessed December 15, 2020).

Gao, Y., Gao, B., Chen, Q., Liu, J., and Zhang, Y. (2020). Deep convolutional neural network-based epileptic electroencephalogram (EEG) signal classification. Front. Neurol. 11:375. doi: 10.3389/fneur.2020.00375
Gogna, A., Majumdar, A., and Ward, R. (2017). Semi-supervised stacked label consistent autoencoder for reconstruction and analysis of biomedical signals. IEEE Transact. Biomed. Eng. 64, 2196–2205. doi: 10.1109/tbme.2016.2631620

Goodfellow, I., Bengio, Y., and Courville, A. (2017). Deep Learning. Cambridge, MA: The MIT Press.

He, H., and Ma, Y. (2013). Imbalanced Learning Foundations, Algorithms, and Applications. Hoboken, NJ: IEEE Press.

Hochreiter, S., and Schmidhuber, J. (1997). Long short-term memory. Neur. Comput. 9, 1735–1780.

Hossain, M. S., Amin, S. U., Alsulaiman, M., and Muhammad, G. (2019). Applying deep learning for epilepsy seizure detection and brain mapping visualization. ACM Transact. Multimed. Comput. Commun. Appl. 15, 1–17. doi: 10.1145/3241056
Hu, W., Cao, J., Lai, X., and Liu, J. (2019). Mean amplitude spectrum based epileptic state classification for seizure prediction using convolutional neural networks. J. Ambient Intell. Human. Comput. doi: 10.1007/s12652-019-01220-6

Ioffe, S., and Szegedy, C. (2015). Batch normalization: accelerating deep network training by reducing internal covariate shift. arXiv [Preprint]. arXiv:1502.03167.

Jaiswal, A. K., and Banka, H. (2017). Local pattern transformation based feature extraction techniques for classification of epileptic EEG signals. Biomed. Signal Process. Control 34, 81–92. doi: 10.1016/j.bspc.2017.01.005

Ke, H., Chen, D., Li, X., Tang, Y., Shah, T., and Ranjan, R. (2018). Towards brain big data classification: epileptic EEG identification with a lightweight VGGNet on global MIC. IEEE Access 6, 14722–14733. doi: 10.1109/access.2018.2810882
Kingma, D., and Ba, J. (2014). Adam: a method for stochastic optimization. arXiv [Preprint]. arXiv:1412.6980.

Krizhevsky, A., Sutskever, I., and Hinton, G. E. (2017). ImageNet classification with deep convolutional neural networks. Commun. ACM 60, 84–90. doi: 10.1145/3065386

Kruskal, W. H., and Wallis, W. A. (1952). Use of ranks in one-criterion variance analysis. J. Am. Statist. Associat. 47, 583–621. doi: 10.1080/01621459.1952.10483441

Mozer, M. C. (1989). A focused backpropagation algorithm for temporal pattern recognition. Comp. Syst. 3, 349–381.

Orhan, U., Hekim, M., and Ozer, M. (2011). EEG signals classification using the K-means clustering and a multilayer perceptron neural network model. Exp. Syst. Appl. 38, 13475–13481. doi: 10.1016/j.eswa.2011.04.149
doi: 10.1016/j.eswa.2011.04.149</p> <p class="ReferencesCopy2"><a href="https://doi.org/10.1016/j.eswa.2011.04.149" target="_blank">CrossRef Full Text</a> | <a href="http://scholar.google.com/scholar_lookup?&#x0026;title=EEG+signals+classification+using+the+K-means+clustering+and+a+multilayer+perceptron+neural+network+model%2E&#x0026;journal=Exp%2E+Syst%2E+Appl%2E&#x0026;author=Orhan+U.&#x0026;author=Hekim+M.&#x0026;author=Ozer+M.&#x0026;publication_year=2011&#x0026;volume=38&#x0026;pages=13475&#x2013;13481" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B26" id="B26"></a>Raghu, S., Sriraam, N., Hegde, A. S., and Kubben, P. L. (2019). A novel approach for classification of epileptic seizures using matrix determinant. <i>Exp. Syst. Appl.</i> 127, 323&#x2013;341. doi: 10.1016/j.eswa.2019.03.021</p> <p class="ReferencesCopy2"><a href="https://doi.org/10.1016/j.eswa.2019.03.021" target="_blank">CrossRef Full Text</a> | <a href="http://scholar.google.com/scholar_lookup?&#x0026;title=A+novel+approach+for+classification+of+epileptic+seizures+using+matrix+determinant%2E&#x0026;journal=Exp%2E+Syst%2E+Appl%2E&#x0026;author=Raghu+S.&#x0026;author=Sriraam+N.&#x0026;author=Hegde+A.+S.&#x0026;author=Kubben+P.+L.&#x0026;publication_year=2019&#x0026;volume=127&#x0026;pages=323&#x2013;341" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B27" id="B27"></a>Samiee, K., Kovacs, P., and Gabbouj, M. (2015). Epileptic seizure classification of EEG time-series using rational discrete short-time fourier transform. <i>IEEE Transact. Biomed. Eng.</i> 62, 541&#x2013;552. doi: 10.1109/tbme.2014.2360101</p> <p class="ReferencesCopy2"><a href="https://pubmed.ncbi.nlm.nih.gov/25265603" target="_blank">PubMed Abstract</a> | <a href="https://doi.org/10.1109/tbme.2014.2360101" target="_blank">CrossRef Full Text</a> | <a href="http://scholar.google.com/scholar_lookup?&#x0026;title=Epileptic+seizure+classification+of+EEG+time-series+using+rational+discrete+short-time+fourier+transform%2E&#x0026;journal=IEEE+Transact%2E+Biomed%2E+Eng%2E&#x0026;author=Samiee+K.&#x0026;author=Kovacs+P.&#x0026;author=Gabbouj+M.&#x0026;publication_year=2015&#x0026;volume=62&#x0026;pages=541&#x2013;552" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B28" id="B28"></a>Schomer, D. L., and Lopez da Silva, H. F. (2018). <i>Niedermeyer&#x2019;s Electroencephalography: Basic Principles, Clinical Applications, and Related Fields.</i> New York, NY: Oxford University Press.</p> <p class="ReferencesCopy2"><a href="http://scholar.google.com/scholar_lookup?&#x0026;journal=Niedermeyer&#x2019;s+Electroencephalography%3A+Basic+Principles%2C+Clinical+Applications%2C+and+Related+Fields%2E&#x0026;author=Schomer+D.+L.&#x0026;author=Lopez+da+Silva+H.+F.&#x0026;publication_year=2018" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B29" id="B29"></a>She, Q., Hu Bo Luo, Z., Nguyen, T., and Zhang, L. (2018). A hierarchical semi-supervised extreme learning machine method for EEG recognition. <i>Med. Biol. Eng. Comput.</i> 57, 147&#x2013;157. 
doi: 10.1007/s11517-018-1875-3</p> <p class="ReferencesCopy2"><a href="https://pubmed.ncbi.nlm.nih.gov/30054779" target="_blank">PubMed Abstract</a> | <a href="https://doi.org/10.1007/s11517-018-1875-3" target="_blank">CrossRef Full Text</a> | <a href="http://scholar.google.com/scholar_lookup?&#x0026;title=A+hierarchical+semi-supervised+extreme+learning+machine+method+for+EEG+recognition%2E&#x0026;journal=Med%2E+Biol%2E+Eng%2E+Comput%2E&#x0026;author=She+Q.&#x0026;author=Hu&#x0026;author=Bo&#x0026;author=Luo+Z.&#x0026;author=Nguyen+T.&#x0026;author=Zhang+L.&#x0026;publication_year=2018&#x0026;volume=57&#x0026;pages=147&#x2013;157" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B30" id="B30"></a>Shoeb, A. H. (2009). <i>Application of Machine Learning to Epileptic Seizure Onset Detection and Treatment.</i> Ph.D. thesis, Massachusetts Institute of Technology, Cambridge, MA.</p> <p class="ReferencesCopy2"><a href="http://scholar.google.com/scholar_lookup?&#x0026;journal=Application+of+Machine+Learning+to+Epileptic+Seizure+Onset+Detection+and+Treatment%2E&#x0026;author=Shoeb+A.+H.&#x0026;publication_year=2009" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B31" id="B31"></a>Sokolova, M., and Lapalme, G. (2009). A systematic analysis of performance measures for classification tasks. <i>Inform. Process. Manag.</i> 45, 427&#x2013;437. doi: 10.1016/j.ipm.2009.03.002</p> <p class="ReferencesCopy2"><a href="https://doi.org/10.1016/j.ipm.2009.03.002" target="_blank">CrossRef Full Text</a> | <a href="http://scholar.google.com/scholar_lookup?&#x0026;title=A+systematic+analysis+of+performance+measures+for+classification+tasks%2E&#x0026;journal=Inform%2E+Process%2E+Manag%2E&#x0026;author=Sokolova+M.&#x0026;author=Lapalme+G.&#x0026;publication_year=2009&#x0026;volume=45&#x0026;pages=427&#x2013;437" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B32" id="B32"></a>Song, Y., Crowcroft, J., and Zhang, J. (2012). Automatic epileptic seizure detection in EEGs based on optimized sample entropy and extreme learning machine. <i>J. Neurosci. Methods</i> 210, 132&#x2013;146. doi: 10.1016/j.jneumeth.2012.07.003</p> <p class="ReferencesCopy2"><a href="https://pubmed.ncbi.nlm.nih.gov/22824535" target="_blank">PubMed Abstract</a> | <a href="https://doi.org/10.1016/j.jneumeth.2012.07.003" target="_blank">CrossRef Full Text</a> | <a href="http://scholar.google.com/scholar_lookup?&#x0026;title=Automatic+epileptic+seizure+detection+in+EEGs+based+on+optimized+sample+entropy+and+extreme+learning+machine%2E&#x0026;journal=J%2E+Neurosci%2E+Methods&#x0026;author=Song+Y.&#x0026;author=Crowcroft+J.&#x0026;author=Zhang+J.&#x0026;publication_year=2012&#x0026;volume=210&#x0026;pages=132&#x2013;146" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B33" id="B33"></a>Tieleman, T., and Hinton, G. (2012). Lecture 6.5-RMSprop: divide the gradient by a running average of its recent magnitude. <i>COURSERA Neur. Netw. Mach. 
Learn.</i> 4, 26&#x2013;31.</p> <p class="ReferencesCopy2"><a href="http://scholar.google.com/scholar_lookup?&#x0026;title=Lecture+6%2E5-RMSprop%3A+divide+the+gradient+by+a+running+average+of+its+recent+magnitude%2E&#x0026;journal=COURSERA+Neur%2E+Netw%2E+Mach%2E+Learn%2E&#x0026;author=Tieleman+T.&#x0026;author=Hinton+G.&#x0026;publication_year=2012&#x0026;volume=4&#x0026;pages=26&#x2013;31" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B34" id="B34"></a>Wang, D., Ren, D., Li, K., Feng, Y., Ma, D., Yan, X., et al. (2018). Epileptic seizure detection in long-term EEG recordings by using wavelet-based directed transfer function. <i>IEEE Transact. Biomed. Eng.</i> 65, 2591&#x2013;2599. doi: 10.1109/tbme.2018.2809798</p> <p class="ReferencesCopy2"><a href="https://pubmed.ncbi.nlm.nih.gov/29993489" target="_blank">PubMed Abstract</a> | <a href="https://doi.org/10.1109/tbme.2018.2809798" target="_blank">CrossRef Full Text</a> | <a href="http://scholar.google.com/scholar_lookup?&#x0026;title=Epileptic+seizure+detection+in+long-term+EEG+recordings+by+using+wavelet-based+directed+transfer+function%2E&#x0026;journal=IEEE+Transact%2E+Biomed%2E+Eng%2E&#x0026;author=Wang+D.&#x0026;author=Ren+D.&#x0026;author=Li+K.&#x0026;author=Feng+Y.&#x0026;author=Ma+D.&#x0026;author=Yan+X.&#x0026;publication_year=2018&#x0026;volume=65&#x0026;pages=2591&#x2013;2599" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B35" id="B35"></a>Wang, X., Zhao, Y., and Pourpanah, F. (2020). Recent advances in deep learning. <i>Int. J. Mach. Learn. Cybern.</i> 11, 747&#x2013;750. doi: 10.1007/s13042-020-01096-5</p> <p class="ReferencesCopy2"><a href="https://doi.org/10.1007/s13042-020-01096-5" target="_blank">CrossRef Full Text</a> | <a href="http://scholar.google.com/scholar_lookup?&#x0026;title=Recent+advances+in+deep+learning%2E&#x0026;journal=Int%2E+J%2E+Mach%2E+Learn%2E+Cybern%2E&#x0026;author=Wang+X.&#x0026;author=Zhao+Y.&#x0026;author=Pourpanah+F.&#x0026;publication_year=2020&#x0026;volume=11&#x0026;pages=747&#x2013;750" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B36" id="B36"></a>Wei, X., Zhou, L., Zhang, Z., Chen, Z., and Zhou, Y. (2019). Early prediction of epileptic seizures using a long-term recurrent convolutional network. <i>J. Neurosci. Methods</i> 327:108395. doi: 10.1016/j.jneumeth.2019.108395</p> <p class="ReferencesCopy2"><a href="https://pubmed.ncbi.nlm.nih.gov/31408651" target="_blank">PubMed Abstract</a> | <a href="https://doi.org/10.1016/j.jneumeth.2019.108395" target="_blank">CrossRef Full Text</a> | <a href="http://scholar.google.com/scholar_lookup?&#x0026;title=Early+prediction+of+epileptic+seizures+using+a+long-term+recurrent+convolutional+network%2E&#x0026;journal=J%2E+Neurosci%2E+Methods&#x0026;author=Wei+X.&#x0026;author=Zhou+L.&#x0026;author=Zhang+Z.&#x0026;author=Chen+Z.&#x0026;author=Zhou+Y.&#x0026;publication_year=2019&#x0026;volume=327&#x0026;issue=108395" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B37" id="B37"></a>World Health Organization (2020). 
<i>Epilepsy.</i> Available online at: <a href="https://www.who.int/en/news-room/fact-sheets/detail/epilepsy">https://www.who.int/en/news-room/fact-sheets/detail/epilepsy</a> (accessed December 5, 2020).</p> <p class="ReferencesCopy2"><a href="http://scholar.google.com/scholar_lookup?&#x0026;journal=Epilepsy%2E&#x0026;publication_year=2020" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B38" id="B38"></a>Yavuz, E., Kasapba&#x015F;&#x0131;, M. C., Ey&#x00FC;po&#x011F;lu, C., and Yaz&#x0131;c&#x0131;, R. (2018). An epileptic seizure detection system based on cepstral analysis and generalized regression neural network. <i>Biocybern. Biomed. Eng.</i> 38, 201&#x2013;216. doi: 10.1016/j.bbe.2018.01.002</p> <p class="ReferencesCopy2"><a href="https://doi.org/10.1016/j.bbe.2018.01.002" target="_blank">CrossRef Full Text</a> | <a href="http://scholar.google.com/scholar_lookup?&#x0026;title=An+epileptic+seizure+detection+system+based+on+cepstral+analysis+and+generalized+regression+neural+network%2E&#x0026;journal=Biocybern%2E+Biomed%2E+Eng%2E&#x0026;author=Yavuz+E.&#x0026;author=Kasapba&#x015F;&#x0131;+M.+C.&#x0026;author=Ey&#x00FC;po&#x011F;lu+C.&#x0026;author=Yaz&#x0131;c&#x0131;+R.&#x0026;publication_year=2018&#x0026;volume=38&#x0026;pages=201&#x2013;216" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B39" id="B39"></a>Yuan, Y., Xun, G., Jia, K., and Zhang, A. (2017). &#x201C;A multi-view deep learning method for epileptic seizure detection using short-time fourier transform,&#x201D; in <i>Proceedings of the 8th ACM International Conference on Bioinformatics, Computational Biology, and Health Informatics</i>, New York, NY. doi: 10.1145/3107411.3107419</p> <p class="ReferencesCopy2"><a href="https://doi.org/10.1145/3107411.3107419" target="_blank">CrossRef Full Text</a> | <a href="http://scholar.google.com/scholar_lookup?&#x0026;title=A+multi-view+deep+learning+method+for+epileptic+seizure+detection+using+short-time+fourier+transform&#x0026;journal=Proceedings+of+the+8th+ACM+International+Conference+on+Bioinformatics%2C+Computational+Biology%2C+and+Health+Informatics&#x0026;author=Yuan+Y.&#x0026;author=Xun+G.&#x0026;author=Jia+K.&#x0026;author=Zhang+A.&#x0026;publication_year=2017" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B40" id="B40"></a>Zeiler, M. D. (2012). ADADELTA: an adaptive learning rate method. <i>arXiv</i> [Preprint]. arXiv:1212.5701.</p> <p class="ReferencesCopy2"><a href="http://scholar.google.com/scholar_lookup?&#x0026;title=ADADELTA%3A+an+adaptive+learning+rate+method%2E&#x0026;journal=arXiv&#x0026;author=Zeiler+M.+D.&#x0026;publication_year=2012" target="_blank">Google Scholar</a></p> </div> <div class="References" style="margin-bottom:0.5em;"> <p class="ReferencesCopy1"><a name="B41" id="B41"></a>Zhou, M., Tian, C., Cao, R., Wang, B., Niu, Y., Hu, T., et al. (2018). Epileptic Seizure detection based on EEG signals and CNN. <i>Fronti. Neuroinform.</i> 12:95. 
Keywords: deep learning, epileptic seizure detection, EEG, autoencoders, classification, convolutional neural network (CNN), bidirectional long short-term memory (Bi-LSTM)

Citation: Abdelhameed A and Bayoumi M (2021) A Deep Learning Approach for Automatic Seizure Detection in Children With Epilepsy. Front. Comput. Neurosci. 15:650050. doi: 10.3389/fncom.2021.650050

Received: 06 January 2021; Accepted: 15 March 2021; Published: 08 April 2021.

Edited by: Saman Sargolzaei, University of Tennessee at Martin, United States

Reviewed by: Nhan Duy Truong, The University of Sydney, Australia; Antonio Dourado, University of Coimbra, Portugal

Copyright © 2021 Abdelhameed and Bayoumi. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Ahmed Abdelhameed, ahmed.abdelhameed1@louisiana.edu

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Introduction

Epilepsy is widely recognized as one of the most critical and persistent neurological disorders affecting the human brain. It affects more than 50 million patients of various ages worldwide (World Health Organization, 2020), including approximately 450,000 patients under the age of 17 in the United States out of nearly 3 million American patients diagnosed with the disease (Epilepsy in Children, 2020). Epilepsy is characterized by recurrent unprovoked seizures. A seizure is a period of anomalous, synchronous innervation of a population of neurons that may last from seconds to a few minutes. Epileptic seizures are ephemeral episodes of partial or complete abnormal, unintentional body movements that may also be accompanied by a loss of consciousness. Although seizures occur only sporadically in any individual patient, their effects on patients' emotions, social interactions, and physical activities make the diagnosis and treatment of epilepsy of the utmost significance.

Electroencephalograms (EEGs; Schomer and Lopes da Silva, 2018), which have been in use for a long time, are the tool neurologists most commonly employ to diagnose brain disorders, and epilepsy in particular, for practical reasons such as availability, simplicity, and low cost. EEG works by positioning several electrodes along the surface of the scalp and then recording and measuring the voltage oscillations produced by the ionic currents flowing through the brain. These oscillations, which correspond to the neuronal activity of the brain, are transformed into multiple time series called signals. EEG is a powerful non-invasive diagnostic tool because it can precisely capture epileptic signals, which are characterized by spikes, sharp waves, or spike-and-wave complexes. As a result, EEG signals are the most widely used input in the clinical examination of epileptic brain states, for both the detection and the prediction of seizures.

By interpreting recorded EEG signals visually, neurologists can reliably distinguish between epileptic brain activity during a seizure (the ictal state) and normal brain activity between seizures (the interictal state). Over the last two decades, however, a large body of work on automated EEG-based epilepsy diagnosis has accumulated, motivated by the exhausting and time-consuming nature of visual evaluation, which depends mainly on the physician's expertise. In addition, objective, rapid, and effective systems for processing vast amounts of EEG recordings have become necessary to reduce the risk of misinterpretation. Such systems would greatly enhance the quality of life of epileptic patients.

Following the acquisition and pre-processing of raw EEG signals, most automated seizure detection techniques consist of two key successive stages. The first stage extracts and selects features from the EEG signals; in the second, a classification system is built and trained to use these features to detect epileptic activity. The feature extraction step has a direct effect on the precision and complexity of the resulting detector. Because EEG signals are non-stationary, feature extraction typically requires considerable work and significant domain knowledge to analyze the signals in the time domain, the frequency domain, or the time-frequency domain (Acharya et al., 2013). It thus falls to the system designer to devise features that best represent the signals and can precisely discriminate between the epileptic brain states of different subjects.

In the literature, several EEG signal features extracted by various methods have been proposed for seizure detection. For example, Song et al. (2012) used approximate entropy and sample entropy as EEG features and integrated them with an extreme learning machine (ELM) for automated seizure detection. Chen et al. (2017) used non-subsampled wavelet–Fourier features for seizure detection. Wang et al. (2018) proposed an algorithm combining wavelet decomposition and the directed transfer function (DTF) for feature extraction. Raghu et al. (2019) proposed the matrix determinant as a feature for analyzing epileptic EEG signals. Yet even when such hand-crafted features achieve good results, there is no inherent guarantee that this intricate and error-prone manual process yields the maximum possible classification accuracy. It is therefore well worth investigating systems that automatically learn the best representative features from minimally pre-processed EEG signals while achieving optimal classification performance.

Recent advances in machine learning, and deep learning breakthroughs in particular, have shown that automatically learned features can be very robust and can outperform human-engineered features in many fields, such as speech recognition, natural language processing, and computer vision, as well as medical diagnosis (Wang et al., 2020). Several seizure detection systems that used artificial neural networks (ANNs) as classifiers after traditional feature extraction have been reported. For instance, Orhan et al. (2011) used a multilayer perceptron (MLP) for classification after applying the discrete wavelet transform (DWT) and the K-means algorithm for feature extraction. Samiee et al. (2015) also used an MLP classifier after extracting features with a discrete short-time Fourier transform (DSTFT). Jaiswal and Banka (2017) evaluated ANNs for classification after extracting features with the local neighbor descriptive pattern (LNDP) and one-dimensional local gradient pattern (1D-LGP) techniques. Yavuz et al. (2018) performed cepstral analysis with a generalized regression neural network for EEG signal classification. Convolutional neural networks (CNNs), on the other hand, have been adopted for both automatic feature learning and classification. Acharya et al. (2018) proposed a 13-layer deep CNN for automatic seizure detection. For the same purpose, Abdelhameed et al. (2018a) designed a system that combined a one-dimensional CNN with a bidirectional long short-term memory (Bi-LSTM) recurrent neural network. Ke et al. (2018), Zhou et al. (2018), and Hossain et al. (2019) also used CNNs for feature extraction and classification, and Hu et al. (2019) incorporated a CNN and a support vector machine (SVM) together for feature extraction and classification of EEG signals.

As reported, most deep learning algorithms with automatic feature learning have targeted single-channel epileptic EEG signals. It therefore remains important to investigate data-driven algorithms that can handle the more complex multichannel epileptic EEG signals.
In general, supervised learning is the most widely used of all machine learning techniques for classifying EEG signals. Several researchers have recently experimented with semi-supervised deep learning strategies in which an autoencoder (AE) neural network benefits from training on both unlabeled and labeled data to improve the efficacy of classification (Gogna et al., 2017; Yuan et al., 2017; Abdelhameed and Bayoumi, 2018, 2019; She et al., 2018). Two approaches to using AEs appear in the literature. In the first, stacked AEs, each layer of a neural network with multiple hidden layers is trained individually by an AE in an unsupervised way; the trained layers are then stacked, a softmax layer is attached, and the resulting network is trained in a supervised fashion. The second approach uses deep AEs to pre-train all layers of the network simultaneously instead of greedy layer-wise training. This latter approach still suffers from one particular drawback: the semi-supervised model must be trained twice. One training episode is conducted in an unsupervised way on unlabeled data so that the AE learns good initial parameters (weights); in the second, the first half of the pre-trained AE (the encoder network), attached to a selected classifier, is trained as a new system in a supervised manner on labeled data to perform the final classification task.

Therefore, in this work, to address the limitations of the classification schemes alluded to above, a novel deep learning-based system using a two-dimensional supervised deep convolutional autoencoder (2D-SDCAE) is proposed for detecting epileptic seizures in multichannel EEG signal recordings. The innovation in the proposed system is that the AE is trained only once, in a supervised way, to perform two tasks at the same time: automatically learning the best features from the EEG signals and summarizing them in a succinct, low-dimensional latent space representation, while simultaneously performing the classification task efficiently. Consolidating the learning of both tasks in a single model that is trained only once in a supervised way has proven to improve the learning capability of the model and thus its classification accuracy.
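This single-pass joint objective can be made concrete with a short sketch. The following is a minimal illustration in Keras-style Python, not the authors' implementation: a toy two-stage convolutional AE whose latent map also feeds a small classifier, compiled with a weighted sum of a reconstruction loss and a classification loss so that one supervised training run optimizes both tasks. All layer sizes, loss weights, and the input shape are illustrative assumptions.

    from tensorflow.keras import layers, Model

    def build_toy_sdcae(input_shape=(1024, 24, 1), n_classes=2):
        inp = layers.Input(shape=input_shape)
        # Encoder: convolution + pooling (only two stages here, for brevity)
        x = layers.Conv2D(32, (3, 2), padding="same", activation="relu")(inp)
        x = layers.MaxPooling2D((2, 2))(x)
        x = layers.Conv2D(64, (3, 2), padding="same", activation="relu")(x)
        latent = layers.MaxPooling2D((2, 2))(x)          # latent space representation
        # Decoder: mirrors the encoder and reconstructs the input segment
        y = layers.Conv2D(64, (3, 2), padding="same", activation="relu")(latent)
        y = layers.UpSampling2D((2, 2))(y)
        y = layers.Conv2D(32, (3, 2), padding="same", activation="relu")(y)
        y = layers.UpSampling2D((2, 2))(y)
        recon = layers.Conv2D(1, (3, 2), padding="same", activation="sigmoid",
                              name="recon")(y)
        # Classifier head fed by the same latent representation
        z = layers.Flatten()(latent)
        z = layers.Dense(64, activation="relu")(z)
        cls = layers.Dense(n_classes, activation="softmax", name="cls")(z)
        return Model(inp, [recon, cls])

    model = build_toy_sdcae()
    # A single supervised training pass optimizes both objectives together
    model.compile(optimizer="adam",
                  loss={"recon": "mse", "cls": "sparse_categorical_crossentropy"},
                  loss_weights={"recon": 1.0, "cls": 1.0})
    # model.fit(X, {"recon": X, "cls": y}, epochs=50, batch_size=32)

Because the reconstruction target is the input itself, the labels are consumed only by the classification head, which is what makes the single supervised training episode sufficient.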
In addition to operating directly on raw EEG signal data, the approach has several advantages. First, the SDCAE is faster than conventional semi-supervised classification systems because it is trained only once. Second, to minimize the total number of network parameters, the proposed SDCAE learns features with convolutional layers instead of the fully connected layers commonly used in regular MLP-based AEs. Third, the proposed system can be used in signal compression schemes, since the original high-dimensional signals can be reconstructed from the low-dimensional latent representation by the second half of the AE (the decoder network). Finally, training AEs in a supervised way is more effective at learning a structured latent representation, which makes it feasible to deploy very simple classifiers and still obtain very high-precision seizure detection systems. Performance and hardware resource savings have also been taken into account to make the proposed system more suitable for real-time use and potential hardware implementation and deployment.

Two SDCAE models are designed to test this approach, and their performance for seizure detection in children is evaluated. Both models classify EEG data segments to distinguish between ictal and interictal brain states. The first model is a two-dimensional deep convolutional autoencoder (2D-DCAE) in which the convolutional layers of the encoder network are attached to a simple MLP network consisting of two fully connected hidden layers and one output layer for classification. The second is also a 2D-DCAE, but here the encoder's convolutional layers are attached to a single Bi-LSTM recurrent layer that performs the classification task. In addition, the performance of both proposed models is compared to that of two regular deep learning models with the same layer structure except that the decoder network layers are removed entirely; these are trained in a supervised manner to perform only the classification task. By quantitatively evaluating the proposed models on different EEG segment lengths, the SDCAE approach proves to be a very good candidate for one of the most accurate seizure detection systems.
Materials and Methods

Dataset

Patients' data from the online Children's Hospital Boston–Massachusetts Institute of Technology (CHB–MIT) database were used to assess and measure the efficacy of the proposed models. The dataset was recorded at Boston Children's Hospital and consists of long-term scalp EEG recordings of 23 pediatric patients with intractable seizures (Shoeb, 2009). The 23-channel EEG recordings were collected using 21 electrodes whose names are specified by the international 10–20 electrode positioning system with the modified combinatorial nomenclature, as shown in Figure 1. The signals were sampled at 256 Hz and band-pass filtered between 0 and 128 Hz.

FIGURE 1 | 21 EEG electrode positions based on the 10–20 system using modified combinatorial nomenclature.

In this study, 16 of the 23 pediatric patients were selected for assessing the classification models; details of the selected patients are listed in Table 1. All of Chb16's seizures were discarded because seizures shorter than 10 s are too short for testing (Gao et al., 2020). The seizures of two patients (Chb12 and Chb13) were omitted owing to excessive variations in channel naming and electrode position swapping. Four patients (Chb04, Chb15, Chb18, and Chb19) were excluded because they are 16 years of age or older, and the aim here is to study seizure detection in young children.

TABLE 1 | Seizure information of the selected patients.

Typically, epileptic patients have a limited number of seizures spanning much shorter times than their seizure-free periods, so a discrepancy between the numbers of ictal and interictal EEG data segments is often present. To overcome bias in the training process, where classifiers tend to favor the class with the largest number of segments, the number of interictal segments was chosen, as a design decision, to be equal to the number of ictal segments when forming the final dataset. Downsampling the original interictal data in this way follows previous work such as Wei et al. (2019) and Gao et al. (2020).
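As a sketch of this balancing step (a hypothetical helper, assuming the segments are already held in NumPy arrays; the paper states the design choice, not the code):

    import numpy as np

    rng = np.random.default_rng(seed=0)

    def balance_segments(ictal, interictal):
        """Randomly downsample the interictal pool to the ictal count K.

        ictal, interictal: arrays of shape (num_segments, L, N).
        Returns segments X and labels y (1 = ictal, 0 = interictal).
        """
        k = len(ictal)
        idx = rng.choice(len(interictal), size=k, replace=False)
        X = np.concatenate([ictal, interictal[idx]], axis=0)
        y = np.concatenate([np.ones(k, dtype=int), np.zeros(k, dtype=int)])
        return X, y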
Non-overlapping EEG segments of 1, 2, and 4 s duration were tested in evaluating the proposed models. A single EEG segment is represented as a matrix of dimension (L × N), where L is the sequence length (256 × the segment duration in seconds) and N is the number of channels; for example, one 2-s segment is represented as a 512 × 23 matrix. The EEG dataset is then formed by stacking all the ictal and interictal segments into one matrix of dimension (2KL × 23), where K is the number of ictal (equivalently, interictal) segments and L is as defined before.
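The segmentation itself is mechanical. The following minimal sketch (hypothetical helper name, assuming a continuous recording held as a NumPy array of shape (num_samples, 23)) produces exactly the (L × N) matrices described above:

    import numpy as np

    def segment_recording(record, fs=256, seconds=2):
        """Cut a continuous multichannel recording into non-overlapping
        segments of shape (L, N), with L = fs * seconds."""
        L = fs * seconds
        usable = (record.shape[0] // L) * L     # drop the incomplete tail
        return record[:usable].reshape(-1, L, record.shape[1])

    # A 1-h, 23-channel record at 256 Hz yields 1800 non-overlapping
    # 2-s segments, each a 512 x 23 matrix.
    segments = segment_recording(np.zeros((256 * 3600, 23)), seconds=2)
    print(segments.shape)    # (1800, 512, 23)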
Dataset Preparation

To prepare the EEG dataset before the training phase, all segments are pre-processed together by applying z-score normalization across all channels at once, so that all values are standardized to zero mean (μ) and unit standard deviation (σ) using Eq. (1):

    x = \frac{x - \mu}{\sigma}    (1)

Next, the whole dataset is scaled as one batch to the [0, 1] range using min–max normalization, to ensure that the original and the reconstructed segments have the same range of values. Finally, the channel dimension of the segments is extended by one column to make the shape more suitable for the AE to be used.
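Put together, the preparation stage can be sketched as follows. This is one minimal reading of the description above: whether the statistics are global or per-channel, and the value used for the extra column, are assumptions here, not stated in the excerpt.

    import numpy as np

    def prepare(segments):
        """segments: array of shape (num_segments, L, 23) of raw EEG values."""
        # Eq. (1): z-score standardization over the whole batch at once
        x = (segments - segments.mean()) / segments.std()
        # Min-max scale the whole batch to [0, 1]
        x = (x - x.min()) / (x.max() - x.min())
        # Extend the channel dimension by one (zero) column: 23 -> 24
        x = np.pad(x, ((0, 0), (0, 0), (0, 1)))
        # Trailing singleton axis so each segment is a 2D "image" with one channel
        return x[..., np.newaxis]

One plausible rationale for extending 23 channels to 24 is that an even width divides cleanly through pooling and upsampling stages; that reading is an inference, not something stated in this excerpt.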
One of these comparison models comprises a two-dimensional deep convolutional neural network (2D-DCNN) connected to an MLP (Figure 4A), while the other connects a 2D-DCNN to a Bi-LSTM (Figure 4B).

Figure 4. Block diagram of the two-dimensional deep convolutional neural network-based models for seizure detection: (A) 2D-DCNN + MLP model and (B) 2D-DCNN + Bi-LSTM model.

Two-Dimensional Deep Convolutional Autoencoder

Convolutional neural networks are a special class of feedforward neural networks that are very well-suited for processing multidimensional data such as images or multi-channel EEG signals. Applications of CNNs in a variety of disciplines, such as computer vision and pattern recognition, have recorded very impressive outcomes (Krizhevsky et al., 2017). This is due to their great ability to hierarchically learn excellent spatial features for representing data of different types. The parameter-sharing and sparse-connection properties of CNNs also make them far more memory-efficient than MLP networks that consist of fully connected layers. As a result of these advantages, a two-dimensional convolutional autoencoder stacked with convolution and pooling layers is proposed in this work rather than a standard AE that uses only fully connected layers.

The encoder subnetwork of the AE is a CNN consisting of four convolutional layers and four max-pooling layers stacked interchangeably. The convolutional layers are responsible for learning the spatial and temporal features in the input EEG signal segments, while the max-pooling layers are used for dimensionality reduction by downsampling. A single convolutional layer is made up of filters (kernels) consisting of trainable parameters (weights) that slide over and convolve with the input to generate feature maps, where the number of feature maps equals the number of applied filters. A configurable parameter (the stride) controls how far the filter window slides over the input. The pooling layer performs downsampling by lowering the dimension of the feature maps to reduce computational complexity. The low-dimensional output of the encoding network is called the latent space representation, or bottleneck.
On the other side, the decoder subnetwork consists of four convolutional layers and four upsampling layers, also deployed interchangeably, which are used to reconstruct the original input.

In all models, the convolutional layers of the encoder network are configured with 32, 32, 64, and 64 filters, respectively. In the decoder network, the first three convolutional layers are configured with 64, 32, and 32 filters, while the last layer has only one filter. All convolutional layers have a kernel size of 3 × 2 and the default stride of one. To keep the height and width of the feature maps unchanged, all convolutional layers are configured with "same" padding. The activation function used in all convolutional layers, except the last layer, is the rectified linear unit (ReLU) defined in Eq. (2), chosen for its sparsity, computational simplicity, and sturdiness against noise in the input signals (Goodfellow et al., 2017):

f(x) = \max\{0, x\}   (2)

where x is the weighted sum of the inputs and f(x) is the ReLU activation function.

The final convolutional layer of the 2D-DCAE uses the sigmoid activation function defined in Eq. (3) to generate an output in the range [0, 1]:

y = \frac{1}{1 + e^{-x}}   (3)

where x is the weighted sum of the inputs and y is the output of the activation function.

All max-pooling layers perform input downsampling by taking the maximum value over windows of size (2, 2), except the last layer, which uses a window of size (2, 3). The first upsampling layer interpolates the rows and columns of the input data using a size of (2, 3), while the last three upsampling layers use sizes of (2, 2). A sketch of this encoder-decoder stack under the stated configuration is given below.
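The article describes the architecture but does not name an implementation framework; the following Keras (TensorFlow) sketch assembles the encoder and decoder with the filter counts, kernel size, pooling and upsampling windows stated above, assuming 2-s segments prepared as 512 × 24 × 1 inputs and (per the batch normalization description below) a batch norm layer between each convolution and max-pooling pair. The exact conv/upsampling ordering in the decoder is an assumption.

    import tensorflow as tf
    from tensorflow.keras import layers

    inp = layers.Input(shape=(512, 24, 1))  # one 2-s EEG segment (padded to 24 channels)

    # Encoder: four conv + batch norm + max-pool stages (32, 32, 64, 64 filters)
    pools = [(2, 2), (2, 2), (2, 2), (2, 3)]  # last pooling window is (2, 3)
    x = inp
    for filters, pool in zip([32, 32, 64, 64], pools):
        x = layers.Conv2D(filters, (3, 2), padding="same", activation="relu")(x)
        x = layers.BatchNormalization()(x)
        x = layers.MaxPooling2D(pool)(x)
    latent = x  # latent space representation (bottleneck): (32, 1, 64)

    # Decoder: four upsampling + conv stages mirroring the encoder
    ups = [(2, 3), (2, 2), (2, 2), (2, 2)]  # first upsampling size is (2, 3)
    y = latent
    for filters, up in zip([64, 32, 32], ups[:3]):
        y = layers.UpSampling2D(up)(y)
        y = layers.Conv2D(filters, (3, 2), padding="same", activation="relu")(y)
    y = layers.UpSampling2D(ups[3])(y)
    recon = layers.Conv2D(1, (3, 2), padding="same", activation="sigmoid",
                          name="reconstruction")(y)  # back to (512, 24, 1)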
Our models apply the batch normalization (batch norm) technique to speed up and stabilize the training process and to ensure high performance. The batch norm transform (Ioffe and Szegedy, 2015) is defined as:

BN_{\gamma,\beta}(x_i) = \gamma \frac{x_i - \mu_B}{\sqrt{\sigma_B^2 + \epsilon}} + \beta   (4)

where an input vector x_i is normalized within a mini-batch B = {x_1, x_2, ..., x_m} having mean μ_B and variance σ_B². β and γ are two parameters that are learned jointly and used to scale and shift the normalized value, while ϵ is a constant added for numerical stability.
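As a numerical illustration of Eq. (4) (not code from the article), the transform can be written directly in NumPy; with γ = 1 and β = 0 it reduces to plain standardization of the mini-batch:

    import numpy as np

    def batch_norm(x_batch, gamma=1.0, beta=0.0, eps=1e-5):
        """Eq. (4): normalize a mini-batch, then scale by gamma and shift by beta."""
        mu_b = x_batch.mean(axis=0)     # mini-batch mean
        var_b = x_batch.var(axis=0)     # mini-batch variance
        x_hat = (x_batch - mu_b) / np.sqrt(var_b + eps)
        return gamma * x_hat + beta

    batch = np.array([[1.0], [2.0], [3.0]])
    print(batch_norm(batch).round(3))  # roughly [[-1.225], [0.], [1.225]]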
Four batch normalization layers are deployed between the four convolutional and max-pooling layers of the encoder subnetwork.

Proposed Classification Models

Two-Dimensional Deep Convolution Autoencoder + MLP

In the first proposed model, depicted in Figure 5, the output of the encoder subnetwork (the latent space representation) is also converted from its multi-dimensional form to a vector using a flatten layer and then fed into an MLP network-based classifier. The MLP network consists of two hidden fully connected layers having 50 and 32 neurons (units), respectively. Both layers use the ReLU activation function. The output layer of the MLP has a sigmoid activation function whose output represents the probability that an input EEG segment belongs to one of the classes.

Figure 5. Proposed 2D-DCAE + MLP architecture (assuming that an EEG segment length is 2 s).
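Continuing the earlier sketch (reusing tf, layers, inp, latent, and recon defined there, and again an assumed Keras rendering rather than the authors' published code), the MLP classification head attaches to the latent tensor as follows:

    # MLP classification head on the flattened latent space (sketch)
    h = layers.Flatten()(latent)                 # (32, 1, 64) -> vector of 2048 values
    h = layers.Dense(50, activation="relu")(h)   # first hidden layer
    h = layers.Dense(32, activation="relu")(h)   # second hidden layer
    label = layers.Dense(1, activation="sigmoid", name="classification")(h)

    # Two outputs: reconstructed segment and predicted class probability
    dcae_mlp = tf.keras.Model(inp, [recon, label])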
Two-Dimensional Deep Convolution Autoencoder + Bi-LSTM

Long short-term memory (LSTM) is a particular architecture of recurrent neural networks. It was developed to solve numerous problems that vanilla RNNs suffer from when trained with backpropagation through time (BPTT) (Mozer, 1989), such as information morphing and exploding and vanishing gradients (Bengio et al., 1994). By introducing the concept of memory cells (units) with three controlling gates, LSTMs are capable of maintaining the gradient values calculated by backpropagation during network training while preserving long-term temporal dependencies between inputs (Hochreiter and Schmidhuber, 1997). Figure 6 shows the structure of a single LSTM cell.

Figure 6. Long short-term memory cell structure.

The following equations show how information is processed inside the LSTM cell:

f_t = \sigma(W_f \cdot [h_{t-1}, x_t] + b_f)   (5)

i_t = \sigma(W_i \cdot [h_{t-1}, x_t] + b_i)   (6)

o_t = \sigma(W_o \cdot [h_{t-1}, x_t] + b_o)   (7)

\tilde{c}_t = \tanh(W_c \cdot [h_{t-1}, x_t] + b_c)   (8)

c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t   (9)

h_t = o_t \odot \tanh(c_t)   (10)

where x_t is the input at time t in a sequence X = (x_1, x_2, x_3, ..., x_n) of n time steps. h_{t-1} and c_{t-1} are the hidden state output and cell state at the previous time step, respectively, while h_t and c_t are the current hidden state and cell state. f_t, i_t, and o_t are the forget, input, and output gates. W and b represent the weight matrices and bias vectors, σ is the sigmoid (logistic) function, and ⊙ is the Hadamard product operator.
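A direct NumPy transcription of Eqs (5)-(10) makes the data flow concrete (a didactic sketch, not the article's code; the weight matrices here act on the concatenated vector [h_{t-1}, x_t]):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def lstm_step(x_t, h_prev, c_prev, W, b):
        """One LSTM time step following Eqs (5)-(10).

        W: dict of (hidden + input, hidden) weight matrices for gates f, i, o, c.
        b: dict of (hidden,) bias vectors for the same gates.
        """
        z = np.concatenate([h_prev, x_t])           # [h_{t-1}, x_t]
        f_t = sigmoid(W["f"].T @ z + b["f"])        # forget gate, Eq (5)
        i_t = sigmoid(W["i"].T @ z + b["i"])        # input gate, Eq (6)
        o_t = sigmoid(W["o"].T @ z + b["o"])        # output gate, Eq (7)
        c_tilde = np.tanh(W["c"].T @ z + b["c"])    # candidate state, Eq (8)
        c_t = f_t * c_prev + i_t * c_tilde          # new cell state, Eq (9)
        h_t = o_t * np.tanh(c_t)                    # new hidden state, Eq (10)
        return h_t, c_t

    hidden, inputs = 4, 3
    rng = np.random.default_rng(0)
    W = {g: rng.normal(size=(hidden + inputs, hidden)) for g in "fioc"}
    b = {g: np.zeros(hidden) for g in "fioc"}
    h, c = np.zeros(hidden), np.zeros(hidden)
    h, c = lstm_step(rng.normal(size=inputs), h, c, W, b)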
The memory cell starts its operation by selecting which information to keep or forget from the previous states using the forget gate f_t. Then, the cell calculates the candidate state \tilde{c}_t. After that, using the prior cell state c_{t-1} and the input gate i_t, the cell decides what further information to write to the current state c_t. Finally, the output gate o_t calculates how much state information h_t will be transported to the next time step. Note that, in Figure 6, the biases and the multiplications between the weight matrices and the concatenated input and hidden state are not shown, to keep the figure simple.

In the second proposed model, the output of the encoder subnetwork is fed into a Bi-LSTM recurrent neural network-based classifier, as shown in Figure 7.

Figure 7. Proposed 2D-DCAE + Bi-LSTM architecture (assuming that an EEG segment length is 2 s).

For classification, a single-layer Bi-LSTM network consisting of two LSTM blocks (cells) is used in this model. The Bi-LSTM network architecture is similar to the standard unidirectional LSTM architecture, except that both LSTM blocks process the output of the encoder, reshaped as a sequence, simultaneously in two opposite directions instead of one. After passing through the entire input sequence, the average of the two blocks' outputs, concatenated together, is computed and used for the classification task. Bi-LSTMs are useful in that they take into account the temporal dependence between the current input at a certain time and its previous and subsequent counterparts, which offers a strong advantage for enhancing the classification results (Abdelhameed et al., 2018b).
Figure 8 shows a single-layer Bi-LSTM network unrolled over n time steps.

Figure 8. Unrolled single-layer Bi-LSTM network.

The Bi-LSTM layer is configured to have 50 units and, to overcome overfitting, the dropout regularization technique is used with a rate of 0.1. As in the first model, the sigmoid activation function is used to predict the EEG segment class label.

Loss Functions and Optimizer

As the SDCAE performs the two tasks of input reconstruction and classification simultaneously, both proposed models are designed to minimize two losses during network training. The first loss is the supervised classification loss (CL) between the predicted and actual class labels. The binary cross-entropy, defined in Eq. (11), is chosen as the loss function:

CL = -\frac{1}{N} \sum_{i=0}^{N-1} \left[ y_i \cdot \log(\hat{y}_i) + (1 - y_i) \cdot \log(1 - \hat{y}_i) \right]   (11)

where \hat{y}_i is the predicted model output for a single EEG segment and y_i is the corresponding actual class label, within a training batch of size N.

The second loss is the reconstruction loss (RL) between the input EEG segments and their reconstructed equivalents decoded by the DCAE; the mean square error defined in Eq. (12) is utilized for calculating this loss:

RL = \frac{1}{N} \sum_{i=0}^{N-1} \frac{1}{mn} \sum_{j=0}^{m-1} \sum_{k=0}^{n-1} (y_{jk} - \hat{y}_{jk})^2   (12)

where y_{jk} is the original value at the position indexed by j, k in an input EEG segment matrix of size (m × n), \hat{y}_{jk} is the reconstructed value, and N is the number of segments, defined as before.

There is not much difference between training a deep learning model with a single output and training one with multiple outputs. In the latter case, as in our proposed SDCAE models, the total loss (TL) of a model is calculated as the weighted summation of the CL and the RL, as in Eq. (13):

TL = w_c \times CL + w_r \times RL   (13)

where w_c and w_r are the weights, which can take any value in the interval (0, 1]. In our design, w_c is chosen to be 0.5 while w_r equals 1.

The backpropagation of the loss in both subnetworks starts by calculating two partial derivatives (gradients): \partial TL / \partial CL and \partial TL / \partial RL.
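In Keras terms (an assumed rendering consistent with the sketches above, reusing tf, layers, inp, latent, and recon), this weighted two-loss training reduces to compiling the two-output model with per-output losses and loss weights; the Bi-LSTM variant only swaps the classification head. The merge_mode choice below reflects the averaging described above but is our reading of an ambiguous sentence:

    # Bi-LSTM head on the latent tensor viewed as a sequence: 32 steps of 64 features
    seq = layers.Reshape((32, 64))(latent)
    rnn = layers.Bidirectional(layers.LSTM(50, dropout=0.1),
                               merge_mode="ave")(seq)  # average the two directions
    label_rnn = layers.Dense(1, activation="sigmoid", name="classification")(rnn)
    dcae_bilstm = tf.keras.Model(inp, [recon, label_rnn])

    # Total loss TL = w_c * CL + w_r * RL, Eq (13), with w_c = 0.5 and w_r = 1
    dcae_bilstm.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
        loss={"classification": "binary_crossentropy",   # CL, Eq (11)
              "reconstruction": "mse"},                  # RL, Eq (12)
        loss_weights={"classification": 0.5, "reconstruction": 1.0},
    )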
All other gradients are then calculated using the chain rule, and the weights and biases are updated in the same way as in typical deep learning models.

Different optimizers, such as stochastic gradient descent (SGD; Bottou, 2004), root mean square propagation (RMSProp; Tieleman and Hinton, 2012), ADADELTA (Zeiler, 2012), and Adam (Kingma and Ba, 2014), were tested while training the SDCAE. Eventually, based on the models' performances, Adam was chosen as the optimizer, with the learning rate set at 0.0001.

Data Selection and Training

The performance of the two proposed SDCAE seizure detection models (DCAE + MLP and DCAE + Bi-LSTM) is evaluated against that of two regular deep learning models (DCNN + MLP and DCNN + Bi-LSTM) using EEG segments of three different lengths. That means a total of twelve models will be tested and assessed using various performance measures.

A stratified 10-fold cross-validation methodology (He and Ma, 2013) is used to prepare the dataset for training and to evaluate the performance of all models, testing their strength and reliability when classifying unseen data. In this methodology, the investigated EEG dataset (containing both interictal and ictal data segments) is randomly divided into ten equal subsamples, or folds, where the balanced distribution of both classes (ictal and interictal) is preserved within each fold. Ten percent of the dataset (one subsample) is marked as the testing set (testing fold), while the remaining nine folds are used collectively as the training set. The cross-validation process is repeated for ten iterations, with each of the 10 folds used exactly once as the testing set. Within each iteration, all models are trained for 200 epochs using a batch size of 32. The average and standard deviation of the classification results over the 10 iterations are calculated to produce the final estimates for the different evaluation measures.

Models Performance Evaluation

Various statistical metrics commonly used in the literature, such as accuracy, sensitivity (recall), specificity, precision, and F1-score (Sokolova and Lapalme, 2009), have been calculated to assess the classification efficiency of the models on the testing set in each of the ten iterations of the 10-fold cross-validation.
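A minimal sketch of this evaluation loop using scikit-learn's StratifiedKFold (the article does not specify its tooling; build_model is a hypothetical factory standing in for any of the four architectures, assumed to return a fresh two-output Keras model as sketched earlier):

    import numpy as np
    from sklearn.model_selection import StratifiedKFold

    def cross_validate(x, labels, build_model):
        """Stratified 10-fold CV as described: 9 folds train, 1 fold tests."""
        skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)
        scores = []
        for train_idx, test_idx in skf.split(x, labels):
            model = build_model()  # fresh, untrained model each iteration
            model.fit(x[train_idx],
                      {"reconstruction": x[train_idx],
                       "classification": labels[train_idx]},
                      epochs=200, batch_size=32, verbose=0)
            _, pred = model.predict(x[test_idx], verbose=0)
            acc = np.mean((pred.ravel() > 0.5) == labels[test_idx])
            scores.append(acc)
        return np.mean(scores), np.std(scores)  # mean and std over the 10 folds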
These evaluation metrics are defined as follows:

Accuracy = \frac{TP + TN}{TP + TN + FP + FN} \times 100\%   (14)

Sensitivity (Recall) = \frac{TP}{TP + FN} \times 100\%   (15)

Specificity = \frac{TN}{TN + FP} \times 100\%   (16)

Precision = \frac{TP}{TP + FP} \times 100\%   (17)

F1\text{-}score = 2 \times \frac{Precision \times Recall}{Precision + Recall}   (18)

where TP, TN, FP, and FN denote the numbers of true positives, true negatives, false positives, and false negatives, respectively.
\u003E\r\n\u003C\u002Fmrow\u003E\r\n\u003C\u002Fmrow\u003E\r\n\u003C\u002Fmfrac\u003E\r\n\u003Cmo\u003E&#x00D7;\u003C\u002Fmo\u003E\r\n\u003Cmrow\u003E\r\n\u003Cmn\u003E100\u003C\u002Fmn\u003E\r\n\u003Cmo lspace=\"0pt\" rspace=\"3.5pt\"\u003E%\u003C\u002Fmo\u003E\r\n\u003C\u002Fmrow\u003E\r\n\u003C\u002Fmrow\u003E\r\n\u003C\u002Fmrow\u003E\r\n\u003Cmspace width=\"5em\"\u002F\u003E\u003Cmo stretchy='false'\u003E(\u003C\u002Fmo\u003E\u003Cmn\u003E18\u003C\u002Fmn\u003E\u003Cmo stretchy='false'\u003E)\u003C\u002Fmo\u003E\u003C\u002Fmath\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cp class=\"mb0\"\u003Ewhere \u003Ci\u003EP\u003C\u002Fi\u003E denotes the number of positive (ictal) EEG segments while \u003Ci\u003EN\u003C\u002Fi\u003E denotes the number of negative (interictal) EEG segments). \u003Ci\u003ETP\u003C\u002Fi\u003E and \u003Ci\u003ETN\u003C\u002Fi\u003E are the numbers of true positives and true negatives while \u003Ci\u003EFP\u003C\u002Fi\u003E and \u003Ci\u003EFN\u003C\u002Fi\u003E are the numbers of false positives and false negatives, respectively. In this study, accuracy is defined as the percentage of the correctly classified EEG segments belonging to any state (ictal or interictal), sensitivity is the percentage of correctly classified ictal state EEG segments, specificity is the percentage of correctly classified interictal state EEG segments, while precision determines how many of the EEG segments classified as belonging to the ictal state are originally ictal state EEG segments. Finally, the F1-score combines the values of precision and recall in a single metric.\u003C\u002Fp\u003E\r\n\u003Ch3\u003EModels Implementation\u003C\u002Fh3\u003E\r\n\u003Cp class=\"mb0\"\u003EThe Python programming language along with many supporting libraries and in particular, the Tensorflow machine learning library&#x2019;s Keras deep learning API, has been used to develop our models. Due to the variations in the hardware resources and different GPU specifications, we have chosen not to include the computational times of training and testing the proposed models as a metric in our comparisons especially since we are developing our models using external resources provided by Google Colaboratory online environment that runs on Google&#x2019;s cloud servers.\u003C\u002Fp\u003E\r\n\u003Ca id=\"h4\" name=\"h4\"\u003E\u003C\u002Fa\u003E\u003Ch2\u003EResults\u003C\u002Fh2\u003E\r\n\u003Cp class=\"mb0\"\u003EFor each of the four models, \u003Ca href=\"#F9\"\u003EFigure 9\u003C\u002Fa\u003E shows the ranges of values of the five performance metrics calculated based on the 10-Fold cross-validation classification results of the EEG segments of lengths 1, 2, and 4 s. 
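To make these metrics concrete, the following is a minimal Python sketch (illustrative only, not the exact code used in this study) that computes the five scores from binary segment predictions; the helper name evaluate_segments and the use of scikit-learn's confusion_matrix are assumptions for the example.

```python
from sklearn.metrics import confusion_matrix

def evaluate_segments(y_true, y_pred):
    """Five evaluation metrics (in %) from binary segment labels,
    where 1 = ictal (positive) and 0 = interictal (negative)."""
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    accuracy    = (tp + tn) / (tp + tn + fp + fn) * 100
    sensitivity = tp / (tp + fn) * 100   # recall over ictal segments
    specificity = tn / (tn + fp) * 100   # Eq. 16
    precision   = tp / (tp + fp) * 100   # Eq. 17
    # Eq. 18: with precision and recall already expressed in %, the
    # harmonic-mean formula returns a percentage as well.
    f1_score = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, sensitivity, specificity, precision, f1_score
```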
Models Implementation

The models were developed in the Python programming language with several supporting libraries, in particular the Keras deep learning API of the TensorFlow machine learning library. Because the models were developed on external resources provided by the Google Colaboratory online environment, which runs on Google's cloud servers with varying hardware resources and GPU specifications, the computational times of training and testing the proposed models are not included as a metric in our comparisons.
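As a rough illustration of how such a model can be assembled with the Keras API, the sketch below builds a small supervised convolutional autoencoder with a Bi-LSTM classification head. It is a minimal sketch under stated assumptions, not the paper's exact architecture: the 23-channel by 1024-sample input shape (a 4-s segment at 256 Hz), the filter counts, the pooling factors, and the joint reconstruction-plus-classification objective are all placeholders chosen for the example.

```python
from tensorflow.keras import layers, Model

# Assumed input: 4-s segments of 23-channel EEG sampled at 256 Hz
# (23 x 1024 samples), treated as a single-channel 2D "image".
inputs = layers.Input(shape=(23, 1024, 1))

# --- Encoder: 2D convolutions that compress the segment along time ---
x = layers.Conv2D(16, (3, 3), padding="same", activation="relu")(inputs)
x = layers.MaxPooling2D(pool_size=(1, 4))(x)
x = layers.Conv2D(32, (3, 3), padding="same", activation="relu")(x)
latent = layers.MaxPooling2D(pool_size=(1, 4))(x)   # shape (23, 64, 32)

# --- Decoder: reconstruction branch (absent in the plain DCNN models) ---
d = layers.Conv2DTranspose(32, (3, 3), strides=(1, 4), padding="same",
                           activation="relu")(latent)
d = layers.Conv2DTranspose(16, (3, 3), strides=(1, 4), padding="same",
                           activation="relu")(d)
reconstruction = layers.Conv2D(1, (3, 3), padding="same",
                               name="reconstruction")(d)

# --- Classifier: Bi-LSTM over the latent time axis ---
s = layers.Permute((2, 1, 3))(latent)      # -> (time=64, channels=23, filters=32)
s = layers.Reshape((64, 23 * 32))(s)       # latent sequence of 64 steps
s = layers.Bidirectional(layers.LSTM(64))(s)
label = layers.Dense(1, activation="sigmoid", name="label")(s)

# One model trained end to end in a supervised way: a classification
# loss on the label output plus a reconstruction loss on the decoder.
model = Model(inputs, [reconstruction, label])
model.compile(optimizer="adam",
              loss={"reconstruction": "mse", "label": "binary_crossentropy"},
              metrics={"label": ["accuracy"]})
```

Removing the decoder branch and training on the label output alone would yield the plain DCNN counterparts used for comparison.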
Results

For each of the four models, Figure 9 shows the ranges of values of the five performance metrics calculated from the 10-fold cross-validation classification results for EEG segments of lengths 1, 2, and 4 s. The mean and standard deviation of all metrics over the 10 folds are then calculated and summarized in Table 2.

Figure 9. Boxplots showing ranges of performance metrics percentages calculated based on the 10-fold cross-validation results.

Table 2. Classification results using different EEG segment lengths.

The same results are interpreted visually in Figure 10.

Figure 10. Visualization of the classification results of the models using different EEG segment lengths.

Discussion

As can be seen from the results, for all EEG segment lengths and evaluation metrics, the two proposed SDCAE models (DCAE + MLP and DCAE + Bi-LSTM) outperformed the two models that do not use AEs (DCNN + MLP and DCNN + Bi-LSTM). Furthermore, as highlighted in Table 2, the DCAE + Bi-LSTM model using a segment length of 4 s achieved the highest performance on all evaluation metrics among all combinations of models. It is also interesting that, for both SDCAE models, a 4-s EEG segment length is the best choice for classification performance. More generally, all models that use a Bi-LSTM for classification accomplished better results than their counterpart models with MLP-based classifiers at the same EEG segment lengths. This can be explained by Bi-LSTM networks being more capable of learning temporal patterns from the generated latent-space sequences than MLP networks. Finally, comparing the standard deviations of the evaluation metrics across all models shows that the results of the SDCAE models mostly have less dispersion than those of the other models, meaning that the SDCAE models' performance is more consistent across the cross-validation iterations.

Figure 11 shows the classification accuracy, CL, and RL curves for the training and testing datasets obtained while training the winning model (DCAE + Bi-LSTM) in one of the iterations of the 10-fold cross-validation.

Figure 11. Accuracy and loss curves against the number of epochs obtained while training the DCAE + Bi-LSTM model.

Statistical Analysis

The non-parametric Kruskal–Wallis H test (Kruskal and Wallis, 1952) is used to test the statistical significance of the classification results of the two proposed models (DCAE + MLP and DCAE + Bi-LSTM). For simplicity, only the test results for the evaluation metrics obtained using an EEG segment length of 4 s are reported. When comparing DCAE + Bi-LSTM with the two models (DCNN + MLP and DCNN + Bi-LSTM), the Kruskal–Wallis H test produced p-value = 0.0005 for accuracy, p-value = 0.02 for sensitivity, p-value = 0.025 for specificity, p-value = 0.005 for precision, and p-value = 0.001 for F1-score. Likewise, when comparing DCAE + MLP with the same two models, the test showed p-value = 0.003 for accuracy, p-value = 0.011 for sensitivity, p-value = 0.083 for specificity, p-value = 0.019 for precision, and p-value = 0.002 for F1-score. Across all performance assessment metrics, every comparison except a single specificity p-value yielded p below 0.05, indicating that the differences between the outcomes of the proposed models and those of the non-autoencoder models are statistically significant.
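As a usage note, this kind of comparison of per-fold results can be run with SciPy's implementation of the test, as in the sketch below; the fold accuracies shown are placeholders, not the values reported in this study.

```python
import numpy as np
from scipy.stats import kruskal

# Placeholder per-fold accuracies (%) from 10-fold cross-validation;
# the real values are those summarized in Table 2.
dcae_bilstm = [98.9, 98.3, 99.1, 98.6, 98.8, 99.0, 98.5, 98.7, 99.2, 98.8]
dcnn_mlp    = [97.1, 96.8, 97.5, 96.9, 97.3, 97.0, 96.7, 97.2, 97.4, 96.9]
dcnn_bilstm = [97.6, 97.2, 97.9, 97.4, 97.8, 97.5, 97.1, 97.7, 98.0, 97.3]

# Mean +/- standard deviation over the folds, as reported in Table 2.
print(f"mean = {np.mean(dcae_bilstm):.2f}, std = {np.std(dcae_bilstm):.2f}")

# Non-parametric H test across the groups of per-fold scores;
# p < 0.05 indicates a statistically significant difference.
h_stat, p_value = kruskal(dcae_bilstm, dcnn_mlp, dcnn_bilstm)
print(f"H = {h_stat:.3f}, p = {p_value:.4f}")
```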
Comparison With Other Methods

In the literature, not all previous work uses the same set of metrics for evaluating the performance of seizure classification algorithms, so this section provides comparisons based only on the most commonly used metrics: accuracy, sensitivity, and specificity. Table 3 summarizes the comparison between our best performing model and some state-of-the-art methods that use deep neural networks for feature extraction and classification of seizures. In Yuan et al. (2017), various stacked sparse denoising autoencoders (SSDAE) were tested and compared for feature extraction and classification after preprocessing using the short-time Fourier transform (STFT); the best accuracy obtained was 93.82%, using a random selection of training and testing datasets. Ke et al. (2018) combined the global maximal information coefficient (MIC) with the visual geometry group network (VGGNet) for feature extraction and classification; using fivefold cross-validation, they achieved 98.1% accuracy, 98.85% sensitivity, and 97.47% specificity. Using the fast Fourier transform (FFT) for frequency-domain analysis together with a CNN, the authors of Zhou et al. (2018) performed patient-specific classification between ictal and interictal signals; relying on sixfold cross-validation, the average of the evaluation metrics over all patients was 97.5% accuracy, 96.86% sensitivity, and 98.15% specificity. Finally, in Hossain et al. (2019), the authors used a 2D-CNN model to extract spectral and temporal characteristics of EEG signals and applied them to patient-specific classification using a random selection of training and testing datasets, obtaining 98.05% accuracy, 90% sensitivity, and 91.65% specificity in the cross-patient results.
In light of this comparison, the results obtained by our model prove superior to those of these state-of-the-art systems, all of which also lack proper statistical significance testing.

Table 3. Comparison between our best performing model and previous methods using the same dataset.

Conclusion

A novel deep-learning approach for the detection of seizures in pediatric patients is proposed. The approach uses a 2D-SDCAE to detect epileptic seizures by classifying minimally pre-processed, raw multichannel EEG signal recordings. In this approach, an AE is trained in a supervised way to classify between the ictal and interictal brain-state EEG signals, exploiting its ability to perform both automatic feature learning and classification simultaneously and with high efficiency. Two SDCAE models, one with a Bi-LSTM-based and one with an MLP-based classifier, were designed and tested using three EEG data segment lengths. The performance of both proposed models was compared to that of two regular deep learning models with the same layer structure, except that the decoder network layers are completely removed. The twelve resulting models (four architectures, each with three segment lengths) were trained and assessed using a 10-fold cross-validation scheme, and based on five evaluation metrics, the best performing model was the SDCAE model that uses a Bi-LSTM and 4-s EEG segments. This model achieved an average of 98.79% accuracy, 98.72% sensitivity, 98.86% specificity, 98.86% precision, and an F1-score of 98.79%. The comparison between this SDCAE model and other state-of-the-art systems using the same dataset has shown that the performance of our proposed model is superior to that of most existing systems.

Data Availability Statement

Publicly available datasets were analyzed in this study. This data can be found here: https://physionet.org/content/chbmit/1.0.0/.

Ethics Statement

Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements.
Written informed consent from the participants' legal guardian/next of kin was not required to participate in this study in accordance with the national legislation and the institutional requirements.

Author Contributions

AA conceived the presented idea, conducted the analysis, and produced the figures. MB supervised the findings of this work. Both authors discussed the results and contributed to the final manuscript.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Abdelhameed, A. M., and Bayoumi, M. (2018). "Semi-supervised deep learning system for epileptic seizures onset prediction," in Proceedings of the 2018 17th IEEE International Conference on Machine Learning and Applications (ICMLA), Orlando, FL. doi: 10.1109/icmla.2018.00191

Abdelhameed, A. M., and Bayoumi, M. (2019). Semi-supervised EEG signals classification system for epileptic seizure detection. IEEE Signal Process. Lett. 26, 1922–1926. doi: 10.1109/lsp.2019.2953870

Abdelhameed, A. M., Daoud, H. G., and Bayoumi, M. (2018a). "Deep convolutional bidirectional LSTM recurrent neural network for epileptic seizure detection," in Proceedings of the 2018 16th IEEE International New Circuits and Systems Conference (NEWCAS), Montreal, QC. doi: 10.1109/newcas.2018.8585542

Abdelhameed, A. M., Daoud, H. G., and Bayoumi, M. (2018b). "Epileptic seizure detection using deep convolutional autoencoder," in Proceedings of the 2018 IEEE International Workshop on Signal Processing Systems (SiPS), Cape Town. doi: 10.1109/sips.2018.8598447

Acharya, U. R., Oh, S. L., Hagiwara, Y., Tan, J. H., and Adeli, H. (2018). Deep convolutional neural network for the automated detection and diagnosis of seizure using EEG signals. Comput. Biol. Med. 100, 270–278. doi: 10.1016/j.compbiomed.2017.09.017

Acharya, U. R., Sree, S. V., Swapna, G., Martis, R. J., and Suri, J. S. (2013). Automated EEG analysis of epilepsy: a review. Knowled. Based Syst. 45, 147–165. doi: 10.1016/j.knosys.2013.02.014

Bengio, Y., Simard, P., and Frasconi, P. (1994). Learning long-term dependencies with gradient descent is difficult. IEEE Transact. Neur. Netw. 5, 157–166. doi: 10.1109/72.279181

Bottou, L. (2004). Stochastic Learning. Advanced Lectures on Machine Learning, LNAI, Vol. 3176. Berlin: Springer, 146–168.

Chen, G., Xie, W., Bui, T. D., and Krzyżak, A. (2017). Automatic epileptic seizure detection in EEG using nonsubsampled wavelet–fourier features. J. Med. Biol. Eng. 37, 123–131. doi: 10.1007/s40846-016-0214-0

Epilepsy in Children (2020). Diagnosis & Treatment. HealthyChildren.org. Available online at: https://www.healthychildren.org/English/health-issues/conditions/seizures/Pages/Epilepsy-in-Children-Diagnosis-and-Treatment.aspx (accessed December 15, 2020).

Gao, Y., Gao, B., Chen, Q., Liu, J., and Zhang, Y. (2020). Deep convolutional neural network-based epileptic electroencephalogram (EEG) signal classification. Front. Neurol. 11:375. doi: 10.3389/fneur.2020.00375

Gogna, A., Majumdar, A., and Ward, R. (2017). Semi-supervised stacked label consistent autoencoder for reconstruction and analysis of biomedical signals. IEEE Transact. Biomed. Eng. 64, 2196–2205. doi: 10.1109/tbme.2016.2631620

Goodfellow, I., Bengio, Y., and Courville, A. (2017). Deep Learning. Cambridge, MA: The MIT Press.

He, H., and Ma, Y. (2013). Imbalanced Learning: Foundations, Algorithms, and Applications. Hoboken, NJ: IEEE Press.

Hochreiter, S., and Schmidhuber, J. (1997). Long short-term memory. Neur. Comput. 9, 1735–1780.

Hossain, M. S., Amin, S. U., Alsulaiman, M., and Muhammad, G. (2019). Applying deep learning for epilepsy seizure detection and brain mapping visualization. ACM Transact. Multimed. Comput. Commun. Appl. 15, 1–17. doi: 10.1145/3241056

Hu, W., Cao, J., Lai, X., and Liu, J. (2019). Mean amplitude spectrum based epileptic state classification for seizure prediction using convolutional neural networks. J. Ambient Intell. Human. Comput. doi: 10.1007/s12652-019-01220-6

Ioffe, S., and Szegedy, C. (2015). Batch normalization: accelerating deep network training by reducing internal covariate shift. arXiv [Preprint]. arXiv:1502.03167.

Jaiswal, A. K., and Banka, H. (2017). Local pattern transformation based feature extraction techniques for classification of epileptic EEG signals. Biomed. Signal Process. Control 34, 81–92. doi: 10.1016/j.bspc.2017.01.005

Ke, H., Chen, D., Li, X., Tang, Y., Shah, T., and Ranjan, R. (2018). Towards brain big data classification: epileptic EEG identification with a lightweight VGGNet on global MIC. IEEE Access 6, 14722–14733. doi: 10.1109/access.2018.2810882

Kingma, D., and Ba, J. (2014). Adam: a method for stochastic optimization. arXiv [Preprint]. arXiv:1412.6980.

Krizhevsky, A., Sutskever, I., and Hinton, G. E. (2017). ImageNet classification with deep convolutional neural networks. Commun. ACM 60, 84–90. doi: 10.1145/3065386

Kruskal, W. H., and Wallis, W. A. (1952). Use of ranks in one-criterion variance analysis. J. Am. Statist. Associat. 47, 583–621. doi: 10.1080/01621459.1952.10483441

Mozer, M. C. (1989). A focused backpropagation algorithm for temporal pattern recognition. Comp. Syst. 3, 349–381.

Orhan, U., Hekim, M., and Ozer, M. (2011). EEG signals classification using the K-means clustering and a multilayer perceptron neural network model. Exp. Syst. Appl. 38, 13475–13481. doi: 10.1016/j.eswa.2011.04.149

Raghu, S., Sriraam, N., Hegde, A. S., and Kubben, P. L. (2019). A novel approach for classification of epileptic seizures using matrix determinant. Exp. Syst. Appl. 127, 323–341. doi: 10.1016/j.eswa.2019.03.021

Samiee, K., Kovacs, P., and Gabbouj, M. (2015). Epileptic seizure classification of EEG time-series using rational discrete short-time fourier transform. IEEE Transact. Biomed. Eng. 62, 541–552. doi: 10.1109/tbme.2014.2360101

Schomer, D. L., and Lopez da Silva, H. F. (2018). Niedermeyer's Electroencephalography: Basic Principles, Clinical Applications, and Related Fields. New York, NY: Oxford University Press.

She, Q., Hu, B., Luo, Z., Nguyen, T., and Zhang, L. (2018). A hierarchical semi-supervised extreme learning machine method for EEG recognition. Med. Biol. Eng. Comput. 57, 147–157. doi: 10.1007/s11517-018-1875-3

Shoeb, A. H. (2009). Application of Machine Learning to Epileptic Seizure Onset Detection and Treatment. Ph.D. thesis, Massachusetts Institute of Technology, Cambridge, MA.

Sokolova, M., and Lapalme, G. (2009). A systematic analysis of performance measures for classification tasks. Inform. Process. Manag. 45, 427–437. doi: 10.1016/j.ipm.2009.03.002

Song, Y., Crowcroft, J., and Zhang, J. (2012). Automatic epileptic seizure detection in EEGs based on optimized sample entropy and extreme learning machine. J. Neurosci. Methods 210, 132–146. doi: 10.1016/j.jneumeth.2012.07.003

Tieleman, T., and Hinton, G. (2012). Lecture 6.5-RMSprop: divide the gradient by a running average of its recent magnitude. COURSERA Neur. Netw. Mach. Learn. 4, 26–31.

Wang, D., Ren, D., Li, K., Feng, Y., Ma, D., Yan, X., et al. (2018). Epileptic seizure detection in long-term EEG recordings by using wavelet-based directed transfer function. IEEE Transact. Biomed. Eng. 65, 2591–2599. doi: 10.1109/tbme.2018.2809798

Wang, X., Zhao, Y., and Pourpanah, F. (2020). Recent advances in deep learning. Int. J. Mach. Learn. Cybern. 11, 747–750. doi: 10.1007/s13042-020-01096-5

Wei, X., Zhou, L., Zhang, Z., Chen, Z., and Zhou, Y. (2019). Early prediction of epileptic seizures using a long-term recurrent convolutional network. J. Neurosci. Methods 327:108395. doi: 10.1016/j.jneumeth.2019.108395

World Health Organization (2020). Epilepsy. Available online at: https://www.who.int/en/news-room/fact-sheets/detail/epilepsy (accessed December 5, 2020).

Yavuz, E., Kasapbaşı, M. C., Eyüpoğlu, C., and Yazıcı, R. (2018). An epileptic seizure detection system based on cepstral analysis and generalized regression neural network. Biocybern. Biomed. Eng. 38, 201–216. doi: 10.1016/j.bbe.2018.01.002

Yuan, Y., Xun, G., Jia, K., and Zhang, A. (2017). "A multi-view deep learning method for epileptic seizure detection using short-time fourier transform," in Proceedings of the 8th ACM International Conference on Bioinformatics, Computational Biology, and Health Informatics, New York, NY. doi: 10.1145/3107411.3107419

Zeiler, M. D. (2012). ADADELTA: an adaptive learning rate method. arXiv [Preprint]. arXiv:1212.5701.

Zhou, M., Tian, C., Cao, R., Wang, B., Niu, Y., Hu, T., et al. (2018). Epileptic seizure detection based on EEG signals and CNN. Front. Neuroinform. 12:95. doi: 10.3389/fninf.2018.00095

Keywords: deep learning, epileptic seizure detection, EEG, autoencoders, classification, convolutional neural network (CNN), bidirectional long short term memory (Bi LSTM)

Citation: Abdelhameed A and Bayoumi M (2021) A Deep Learning Approach for Automatic Seizure Detection in Children With Epilepsy. Front. Comput. Neurosci. 15:650050. doi: 10.3389/fncom.2021.650050

Received: 06 January 2021; Accepted: 15 March 2021; Published: 08 April 2021.

Edited by: Saman Sargolzaei, University of Tennessee at Martin, United States

Reviewed by: Nhan Duy Truong, The University of Sydney, Australia; Antonio Dourado, University of Coimbra, Portugal

Copyright © 2021 Abdelhameed and Bayoumi. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice.
No use, distribution or reproduction is permitted which does not comply with these terms.\u003C\u002Fp\u003E\r\n\u003Cp\u003E\u003Cspan\u003E*Correspondence:\u003C\u002Fspan\u003E Ahmed Abdelhameed, \u003Ca href=\"mailto:ahmed.abdelhameed1@louisiana.edu\"\u003Eahmed.abdelhameed1@louisiana.edu\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003Cdiv class=\"clear\"\u003E\u003C\u002Fdiv\u003E\r\n\u003C\u002Fdiv\u003E",menuHtml:"\u003Cul class=\"flyoutJournal\"\u003E\r\n\u003Cli\u003E\u003Ca href=\"#h1\"\u003EAbstract\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Ca href=\"#h2\"\u003EIntroduction\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Ca href=\"#h3\"\u003EMaterials and Methods\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Ca href=\"#h4\"\u003EResults\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Ca href=\"#h5\"\u003EDiscussion\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Ca href=\"#h6\"\u003EConclusion\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Ca href=\"#h7\"\u003EData Availability Statement\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Ca href=\"#h8\"\u003EEthics Statement\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Ca href=\"#h9\"\u003EAuthor Contributions\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Ca href=\"#conf1\"\u003EConflict of Interest\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Ca href=\"#refer1\"\u003EReferences\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003C\u002Ful\u003E\r\n"},files:[{name:"EPUB.epub",fileServerPackageEntryId:h,type:{code:al,name:al}},{name:am,fileServerPackageEntryId:"fncom-15-650050\u002Ffncom-15-650050.pdf",type:{code:p,name:p}},{name:am,fileServerPackageEntryId:h,type:{code:p,name:p}},{name:"fncom-15-650050.xml",fileServerPackageEntryId:"fncom-15-650050\u002Ffncom-15-650050.xml",type:{code:"NLM_XML",name:"XML"}},{name:"Provisional PDF.pdf",fileServerPackageEntryId:h,type:{code:p,name:p}}]},currentArticlePageMetaInfo:{title:an,link:[{rel:"canonical",href:ao}],meta:[{hid:u,property:u,name:u,content:ap},{hid:aq,property:aq,name:"title",content:an},{hid:ar,property:ar,name:u,content:ap},{hid:as,name:as,content:"deep learning,Epileptic seizure detection,EEG,Autoencoders,Classification,Convolutional Neural Networks,Bidirectional long short term memory (Bi LSTM)"},{hid:at,property:at,name:"site_name",content:v},{hid:au,property:au,name:C,content:"https:\u002F\u002Fimages-provider.frontiersin.org\u002Fapi\u002Fipx\u002Fw=1200&f=png\u002Fhttps:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F650050\u002Ffncom-15-650050-HTML\u002Fimage_m\u002Ffncom-15-650050-g001.jpg"},{hid:av,property:av,name:"type",content:"article"},{hid:aw,property:aw,name:"url",content:ao},{hid:ax,name:ax,content:"summary_large_image"},{hid:ay,name:ay,content:"15"},{hid:az,name:az,content:o},{hid:aA,name:aA,content:v},{hid:aB,name:aB,content:D},{hid:aC,name:aC,content:E},{hid:aD,name:aD,content:Y},{hid:aE,name:aE,content:"650050"},{hid:aF,name:aF,content:"English"},{hid:aG,name:aG,content:Z},{hid:aH,name:aH,content:"deep learning; Epileptic seizure detection; EEG; Autoencoders; Classification; Convolutional Neural Networks; Bidirectional long short term memory (Bi 
LSTM)"},{hid:aI,name:aI,content:_},{hid:aJ,name:aJ,content:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Farticles\u002F10.3389\u002Ffncom.2021.650050\u002Fpdf"},{hid:aK,name:aK,content:"2021\u002F03\u002F15"},{hid:aL,name:aL,content:"2021\u002F04\u002F08"},{hid:"citation_author_0",name:aM,content:"Abdelhameed, Ahmed"},{hid:"citation_author_institution_0",name:aN,content:aO},{hid:"citation_author_1",name:aM,content:"Bayoumi, Magdy"},{hid:"citation_author_institution_1",name:aN,content:aO},{hid:aP,name:aP,content:"doi:10.3389\u002Ffncom.2021.650050"}],script:[{src:"https:\u002F\u002Fcdnjs.cloudflare.com\u002Fpolyfill\u002Fv3\u002Fpolyfill.min.js?features=es6",body:g,async:g},{src:"https:\u002F\u002Fcdnjs.cloudflare.com\u002Fajax\u002Flibs\u002Fmathjax\u002F2.7.1\u002FMathJax.js?config=TeX-MML-AM_CHTML",body:g,async:g},{src:"https:\u002F\u002Fd1bxh8uas1mnw7.cloudfront.net\u002Fassets\u002Faltmetric_badges-f0bc9b243ff5677d05460c1eb71834ca998946d764eb3bc244ab4b18ba50d21e.js",body:g,async:g},{src:"https:\u002F\u002Fapi.altmetric.com\u002Fv1\u002Fdoi\u002F10.3389\u002Ffncom.2021.650050?callback=_altmetric.embed_callback&domain=www.frontiersin.org&key=3c130976ca2b8f2e88f8377633751ba1&cache_until=14-15",body:g,async:g},{src:"https:\u002F\u002Fwidgets.figshare.com\u002Fstatic\u002Ffigshare.js",body:g,async:g},{src:"https:\u002F\u002Fcrossmark-cdn.crossref.org\u002Fwidget\u002Fv2.0\u002Fwidget.js",body:g,async:g}]},articleHubArticlesList:[],showCrossmarkWidget:g,hasSupplementalData:l,isPreviewArticlePage:l,settingsFeaturesSwitchers:{displayTitlePillLabels:g,displayRelatedArticlesBox:g,showEditors:g,showReviewers:g,showLoopImpactLink:g},tenantConfig:{spaceId:c,name:v,availableJournalPages:[aQ,aR,aS,"volumes","about"]},components:{ibar:{tenantLogo:h,journalLogo:h,aboutUs:[{title:"Who we are",links:[{text:"Mission and values",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fmission",target:f,ariaLabel:e},{text:"History",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fhistory",target:f,ariaLabel:e},{text:"Leadership",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fleadership",target:f,ariaLabel:e},{text:"Awards",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fawards",target:f,ariaLabel:e}]},{title:"Impact and progress",links:[{text:"Frontiers' impact",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fimpact",target:f,ariaLabel:e},{text:"Progress Report 2022",url:"https:\u002F\u002Fprogressreport.frontiersin.org\u002F?utm_source=fweb&utm_medium=frep&utm_campaign=pr20",target:k,ariaLabel:e},{text:"All progress reports",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fprogress-reports",target:f,ariaLabel:e}]},{title:"Publishing model",links:[{text:aT,url:aU,target:f,ariaLabel:e},{text:"Open access",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fopen-access",target:f,ariaLabel:e},{text:aV,url:aW,target:f,ariaLabel:e},{text:"Peer review",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fpeer-review",target:f,ariaLabel:e},{text:"Research integrity",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fresearch-integrity",target:f,ariaLabel:e},{text:aX,url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fresearch-topics",target:f,ariaLabel:e}]},{title:"Services",links:[{text:"Societies",url:"https:\u002F\u002Fpublishingpartnerships.frontiersin.org\u002F",target:k,ariaLabel:e},{text:"National 
consortia",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fopen-access-agreements\u002Fconsortia",target:f,ariaLabel:e},{text:"Institutional partnerships",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fopen-access-agreements",target:f,ariaLabel:e},{text:"Collaborators",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fcollaborators",target:f,ariaLabel:e}]},{title:"More from Frontiers",links:[{text:"Frontiers Forum",url:aY,target:k,ariaLabel:"this link will take you to the Frontiers Forum website"},{text:aZ,url:a_,target:k,ariaLabel:a$},{text:"Press office",url:"https:\u002F\u002Fpressoffice.frontiersin.org\u002F",target:k,ariaLabel:"this link will take you to the Frontiers press office website"},{text:"Sustainability",url:"https:\u002F\u002Fwww.frontiersin.orgabout\u002Fsustainability",target:f,ariaLabel:"link to information about Frontiers' sustainability"},{text:ba,url:bb,target:k,ariaLabel:"this link will take you to the Frontiers careers website"},{text:"Contact us",url:bc,target:f,ariaLabel:"this link will take you to the help pages to contact our support team"}]}],submitUrl:"https:\u002F\u002Fwww.frontiersin.org\u002Fsubmission\u002Fsubmit?domainid=1&fieldid=55&specialtyid=237&entitytype=1&entityid=9",showSubmitButton:g,journal:{id:n,name:o,slug:r,sections:[]},sectionTerm:"Sections",aboutJournal:[{title:"Scope",links:[{text:"Specialty chief editors",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Fabout#about-editors",target:f,ariaLabel:e},{text:"Mission & scope",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Fabout#about-scope",target:f,ariaLabel:e},{text:"Facts",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Fabout#about-facts",target:f,ariaLabel:e},{text:"Submission",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Fabout#about-submission",target:f,ariaLabel:e},{text:"Open access statement",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Fabout#about-open",target:f,ariaLabel:e},{text:"Copyright statement",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Fabout#copyright-statement",target:f,ariaLabel:e},{text:"Quality",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Fabout#about-quality",target:f,ariaLabel:e}]},{title:"For authors",links:[{text:"Why submit?",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Ffor-authors\u002Fwhy-submit",target:f,ariaLabel:e},{text:"Article types",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Ffor-authors\u002Farticle-types",target:f,ariaLabel:e},{text:bd,url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Ffor-authors\u002Fauthor-guidelines",target:f,ariaLabel:e},{text:be,url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Ffor-authors\u002Feditor-guidelines",target:f,ariaLabel:e},{text:"Publishing fees",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Ffor-authors\u002Fpublishing-fees",target:f,ariaLabel:e},{text:"Submission checklist",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Ffor-authors\u002Fsubmission-checklist",target:f,ariaLabel:e},{text:"Contact 
editorial office",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Ffor-authors\u002Fcontact-editorial-office",target:f,ariaLabel:e}]}],mainLinks:[{text:"All journals",url:bf,target:f,ariaLabel:e},{text:"All articles",url:bg,target:f,ariaLabel:e}],journalLinks:[{text:bh,url:aQ,target:f,ariaLabel:e},{text:aX,url:aS,target:f,ariaLabel:e},{text:"Editorial board",url:aR,target:f,ariaLabel:e}],helpCenterLink:{text:w,url:bi,target:k,ariaLabel:w}},footer:{blocks:[{title:"Guidelines",links:[{text:bd,url:"https:\u002F\u002Fwww.frontiersin.org\u002Fguidelines\u002Fauthor-guidelines",target:f,ariaLabel:e},{text:be,url:"https:\u002F\u002Fwww.frontiersin.org\u002Fguidelines\u002Feditor-guidelines",target:f,ariaLabel:e},{text:"Policies and publication ethics",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fguidelines\u002Fpolicies-and-publication-ethics",target:f,ariaLabel:e},{text:aV,url:aW,target:f,ariaLabel:e}]},{title:"Explore",links:[{text:bh,url:bg,target:f,ariaLabel:e},{text:"Research Topics ",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fresearch-topics",target:f,ariaLabel:e},{text:"Journals",url:bf,target:f,ariaLabel:e},{text:aT,url:aU,target:f,ariaLabel:e}]},{title:"Outreach",links:[{text:"Frontiers Forum ",url:aY,target:k,ariaLabel:"Frontiers Forum website"},{text:"Frontiers Policy Labs ",url:"https:\u002F\u002Fpolicylabs.frontiersin.org\u002F",target:k,ariaLabel:e},{text:bj,url:"https:\u002F\u002Fkids.frontiersin.org\u002F",target:k,ariaLabel:"Frontiers for Young Minds journal"},{text:aZ,url:a_,target:k,ariaLabel:a$}]},{title:"Connect",links:[{text:w,url:bi,target:k,ariaLabel:w},{text:"Emails and alerts ",url:"https:\u002F\u002Floop.frontiersin.org\u002Fsettings\u002Femail-preferences?a=publishers",target:k,ariaLabel:"Subscribe to Frontiers emails"},{text:"Contact us ",url:bc,target:f,ariaLabel:"Subscribe to newsletter"},{text:"Submit",url:"https:\u002F\u002Fwww.frontiersin.org\u002Fsubmission\u002Fsubmit",target:f,ariaLabel:e},{text:ba,url:bb,target:k,ariaLabel:e}]}],socialLinks:[{link:{text:bk,url:"https:\u002F\u002Fwww.facebook.com\u002FFrontiersin",target:k,ariaLabel:bk},type:x,color:y,icon:"Facebook",size:z,hiddenText:g},{link:{text:"Frontiers Twitter",url:"https:\u002F\u002Ftwitter.com\u002Ffrontiersin",target:k,ariaLabel:e},type:x,color:y,icon:"Twitter",size:z,hiddenText:g},{link:{text:"Frontiers LinkedIn",url:"https:\u002F\u002Fwww.linkedin.com\u002Fcompany\u002Ffrontiers",target:k,ariaLabel:e},type:x,color:y,icon:"LinkedIn",size:z,hiddenText:g},{link:{text:"Frontiers Instagram",url:"https:\u002F\u002Fwww.instagram.com\u002Ffrontiersin_",target:k,ariaLabel:e},type:x,color:y,icon:"Instagram",size:z,hiddenText:g}],copyright:"Frontiers Media S.A. 
All rights reserved",termsAndConditionsUrl:"https:\u002F\u002Fwww.frontiersin.org\u002Flegal\u002Fterms-and-conditions",privacyPolicyUrl:"https:\u002F\u002Fwww.frontiersin.org\u002Flegal\u002Fprivacy-policy"},newsletterComponent:e,snackbarItems:[]},mainHeader:{title:h,image:F,breadcrumbs:[],linksCollection:{total:m,items:[]},metricsCollection:{total:m,items:[]}},user:{loggedUserInfo:F},journals:[{id:n,name:bl,slug:bm,abbreviation:bn,space:{id:i,domainName:j,__typename:b},__typename:a},{id:2445,name:bl,slug:bm,abbreviation:bn,space:{id:c,domainName:d,__typename:b},__typename:a},{id:J,name:"Test SSPH Journal",slug:"test-ssph-journal",abbreviation:"testjournal",space:{id:q,domainName:A,__typename:b},__typename:a},{id:bo,name:"TEST ALF Journal",slug:"test-alf-journal",abbreviation:"talfj",space:{id:s,domainName:K,__typename:b},__typename:a},{id:i,name:bp,slug:bq,abbreviation:br,space:{id:i,domainName:j,__typename:b},__typename:a},{id:2360,name:bp,slug:bq,abbreviation:br,space:{id:c,domainName:d,__typename:b},__typename:a},{id:c,name:"Smoke Test Field",slug:"smoke-test-field",abbreviation:"FJST",space:{id:L,domainName:bs,__typename:b},__typename:a},{id:bo,name:bt,slug:bu,abbreviation:bv,space:{id:q,domainName:A,__typename:b},__typename:a},{id:2077,name:bt,slug:bu,abbreviation:bv,space:{id:c,domainName:d,__typename:b},__typename:a},{id:J,name:bw,slug:bx,abbreviation:by,space:{id:s,domainName:K,__typename:b},__typename:a},{id:J,name:bw,slug:bx,abbreviation:by,space:{id:c,domainName:d,__typename:b},__typename:a},{id:bz,name:bA,slug:bB,abbreviation:bC,space:{id:i,domainName:j,__typename:b},__typename:a},{id:3776,name:bA,slug:bB,abbreviation:bC,space:{id:c,domainName:d,__typename:b},__typename:a},{id:G,name:bD,slug:bE,abbreviation:bF,space:{id:i,domainName:j,__typename:b},__typename:a},{id:3765,name:bD,slug:bE,abbreviation:bF,space:{id:c,domainName:d,__typename:b},__typename:a},{id:14,name:bG,slug:bH,abbreviation:bI,space:{id:i,domainName:j,__typename:b},__typename:a},{id:3414,name:bG,slug:bH,abbreviation:bI,space:{id:c,domainName:d,__typename:b},__typename:a},{id:20,name:bJ,slug:bK,abbreviation:bL,space:{id:i,domainName:j,__typename:b},__typename:a},{id:3754,name:bJ,slug:bK,abbreviation:bL,space:{id:c,domainName:d,__typename:b},__typename:a},{id:L,name:bM,slug:bN,abbreviation:bO,space:{id:i,domainName:j,__typename:b},__typename:a},{id:2444,name:bM,slug:bN,abbreviation:bO,space:{id:c,domainName:d,__typename:b},__typename:a},{id:bP,name:bQ,slug:bR,abbreviation:bS,space:{id:q,domainName:A,__typename:b},__typename:a},{id:bP,name:bQ,slug:bR,abbreviation:bS,space:{id:c,domainName:d,__typename:b},__typename:a},{id:i,name:"GSL Test",slug:"gsl-test",abbreviation:"gslt",space:{id:t,domainName:M,__typename:b},__typename:a},{id:2356,name:"Frontiers in the Internet of Things",slug:"the-internet-of-things",abbreviation:"friot",space:{id:c,domainName:d,__typename:b},__typename:a},{id:656,name:"Frontiers in Zoological Science",slug:"zoological-science",abbreviation:"fzoos",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1720,name:"Frontiers in Zoological Research",slug:"zoological-research",abbreviation:"fzolr",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3162,name:"Frontiers in Wound Care",slug:"wound-care",abbreviation:"fwoca",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3136,name:"Frontiers in Worm Science",slug:"worm-science",abbreviation:"fwors",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3583,name:"Frontiers in Wind 
Energy",slug:"wind-energy",abbreviation:"fwinde",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1451,name:"Frontiers in Water",slug:"water",abbreviation:"frwa",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1561,name:"Frontiers in Virtual Reality",slug:"virtual-reality",abbreviation:"frvir",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2000,name:"Frontiers in Virology",slug:"virology",abbreviation:"fviro",space:{id:c,domainName:d,__typename:b},__typename:a},{id:649,name:"Frontiers in Veterinary Science",slug:"veterinary-science",abbreviation:"fvets",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2176,name:"Frontiers in Urology",slug:"urology",abbreviation:"fruro",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3099,name:"Frontiers in Tuberculosis",slug:"tuberculosis",abbreviation:"ftubr",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1843,name:"Frontiers in Tropical Diseases",slug:"tropical-diseases",abbreviation:"fitd",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2417,name:"Frontiers in Transplantation",slug:"transplantation",abbreviation:"frtra",space:{id:c,domainName:d,__typename:b},__typename:a},{id:473,name:"Frontiers in Toxicology",slug:"toxicology",abbreviation:"ftox",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2105,name:"Frontiers in Thermal Engineering",slug:"thermal-engineering",abbreviation:"fther",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3190,name:"Frontiers in The Neurobiology of Pain",slug:"the-neurobiology-of-pain",abbreviation:h,space:{id:c,domainName:d,__typename:b},__typename:a},{id:1967,name:"Frontiers in Test_Field_Science_Archive",slug:"testfieldsciencearchive",abbreviation:"fntesc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1347,name:"Frontiers in Test_Field_Humanities_Archive",slug:"testfieldhumanitiesarchive",abbreviation:"fntes",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3573,name:"Frontiers in Taxonomy",slug:"taxonomy",abbreviation:"Front. 
Taxon.",space:{id:c,domainName:d,__typename:b},__typename:a},{id:q,name:"Frontiers in Systems Neuroscience",slug:"systems-neuroscience",abbreviation:"fnsys",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1721,name:"Frontiers in Systems Biology",slug:"systems-biology",abbreviation:"fsysb",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3381,name:"Frontiers in Synthetic Biology",slug:"synthetic-biology",abbreviation:"fsybi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:22,name:"Frontiers in Synaptic Neuroscience",slug:"synaptic-neuroscience",abbreviation:"fnsyn",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2299,name:"Frontiers in Sustainable Tourism",slug:"sustainable-tourism",abbreviation:"frsut",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2483,name:"Frontiers in Sustainable Resource Management",slug:"sustainable-resource-management",abbreviation:"fsrma",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1335,name:"Frontiers in Sustainable Food Systems",slug:"sustainable-food-systems",abbreviation:"fsufs",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2726,name:"Frontiers in Sustainable Energy Policy",slug:"sustainable-energy-policy",abbreviation:"fsuep",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1468,name:"Frontiers in Sustainable Cities",slug:"sustainable-cities",abbreviation:"frsc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1397,name:"Frontiers in Sustainable Business",slug:"sustainable-business",abbreviation:"fisb",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1547,name:"Frontiers in Sustainability",slug:"sustainability",abbreviation:"frsus",space:{id:c,domainName:d,__typename:b},__typename:a},{id:604,name:"Frontiers in Surgery",slug:"surgery",abbreviation:"fsurg",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2504,name:"Frontiers in Structural Biology",slug:"structural-biology",abbreviation:"frsbi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2497,name:"Frontiers in Stroke",slug:"stroke",abbreviation:"fstro",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3434,name:"Frontiers in Stem Cells",slug:"stem-cells",abbreviation:"fstce",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1482,name:"Frontiers in Sports and Active Living",slug:"sports-and-active-living",abbreviation:"fspor",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1695,name:"Frontiers in Space Technologies",slug:"space-technologies",abbreviation:"frspt",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3519,name:"Frontiers in Solar Energy",slug:"solar-energy",abbreviation:"fsoln",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1718,name:"Frontiers in Soil Science",slug:"soil-science",abbreviation:"fsoil",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2346,name:"Frontiers in Soft Matter",slug:"soft-matter",abbreviation:"frsfm",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1213,name:"Frontiers in Sociology",slug:"sociology",abbreviation:"fsoc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:N,name:"Frontiers in Society Journal Archive",slug:"society-journal-archive",abbreviation:O,space:{id:c,domainName:d,__typename:b},__typename:a},{id:2690,name:"Frontiers in Social Psychology",slug:"social-psychology",abbreviation:"frsps",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2819,name:"Frontiers in Smart 
Grids",slug:"smart-grids",abbreviation:"frsgr",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2288,name:"Frontiers in Sleep",slug:"sleep",abbreviation:"frsle",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2552,name:"Frontiers in Skin Cancer",slug:"skin-cancer",abbreviation:"fskcr",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1786,name:"Frontiers in Signal Processing",slug:"signal-processing",abbreviation:"frsip",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1704,name:"Frontiers in Sensors",slug:"sensors",abbreviation:"fsens",space:{id:c,domainName:d,__typename:b},__typename:a},{id:q,name:"Frontiers in Science archive",slug:"science-archive",abbreviation:B,space:{id:i,domainName:j,__typename:b},__typename:a},{id:3737,name:"Frontiers in Science Diplomacy",slug:"science-diplomacy",abbreviation:"fsdip",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2766,name:"Frontiers in Science",slug:Q,abbreviation:"fsci",space:{id:c,domainName:d,__typename:b},__typename:a},{id:657,name:"Frontiers in Robotics and AI",slug:"robotics-and-ai",abbreviation:"frobt",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1606,name:"Frontiers in Research Metrics and Analytics",slug:"research-metrics-and-analytics",abbreviation:"frma",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1479,name:"Frontiers in Reproductive Health",slug:"reproductive-health",abbreviation:"frph",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1830,name:"Frontiers in Remote Sensing",slug:"remote-sensing",abbreviation:"frsen",space:{id:c,domainName:d,__typename:b},__typename:a},{id:659,name:"Frontiers in Rehabilitation Sciences",slug:"rehabilitation-sciences",abbreviation:"fresc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3550,name:"Frontiers in Regenerative Medicine",slug:"regenerative-medicine",abbreviation:"fregm",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1949,name:"Frontiers in Radiology",slug:"radiology",abbreviation:"fradi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3189,name:"Frontiers in RNA Research",slug:"rna-research",abbreviation:"frnar",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2306,name:"Frontiers in Quantum Science and Technology",slug:"quantum-science-and-technology",abbreviation:"frqst",space:{id:c,domainName:d,__typename:b},__typename:a},{id:N,name:"Frontiers in Public Health Archive",slug:"public-health-archive",abbreviation:O,space:{id:q,domainName:A,__typename:b},__typename:a},{id:609,name:"Frontiers in Public Health",slug:"public-health",abbreviation:"fpubh",space:{id:c,domainName:d,__typename:b},__typename:a},{id:36,name:"Frontiers in Psychology",slug:"psychology",abbreviation:"fpsyg",space:{id:c,domainName:d,__typename:b},__typename:a},{id:71,name:"Frontiers in Psychiatry",slug:"psychiatry",abbreviation:"fpsyt",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3267,name:"Frontiers in Protistology",slug:"protistology",abbreviation:"frpro",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2452,name:"Frontiers in Proteomics",slug:"proteomics",abbreviation:"fprot",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3171,name:"Frontiers in Prosthetics and Orthotics",slug:"prosthetics-and-orthotics",abbreviation:"fpror ",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3643,name:"Frontiers in Polymer Science",slug:"polymer-science",abbreviation:"fplms",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1558,name:"Frontiers in Political 
Science",slug:"political-science",abbreviation:"fpos",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3615,name:"Frontiers in Polar Science",slug:"polar-science",abbreviation:"fposc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:373,name:"Frontiers in Plant Science",slug:"plant-science",abbreviation:"fpls",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3477,name:"Frontiers in Plant Physiology",slug:"plant-physiology",abbreviation:"fphgy",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3589,name:"Frontiers in Plant Genomics",slug:"plant-genomics",abbreviation:"fpgen",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3579,name:"Frontiers in Plant Ecology",slug:"plant-ecology",abbreviation:"fpley",space:{id:c,domainName:d,__typename:b},__typename:a},{id:210,name:"Frontiers in Physiology",slug:"physiology",abbreviation:"fphys",space:{id:c,domainName:d,__typename:b},__typename:a},{id:616,name:"Frontiers in Physics",slug:"physics",abbreviation:"fphy",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1803,name:"Frontiers in Photonics",slug:"photonics",abbreviation:"fphot",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3604,name:"Frontiers in Photobiology",slug:"photobiology",abbreviation:"fphbi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:176,name:"Frontiers in Pharmacology",slug:"pharmacology",abbreviation:"fphar",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3388,name:"Frontiers in Personality Disorders",slug:"personality-disorders",abbreviation:"fprsd",space:{id:c,domainName:d,__typename:b},__typename:a},{id:606,name:"Frontiers in Pediatrics",slug:"pediatrics",abbreviation:"fped",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2554,name:"Frontiers in Pediatric Dermatology",slug:"pediatric-dermatology",abbreviation:"fpdm",space:{id:c,domainName:d,__typename:b},__typename:a},{id:N,name:"Frontiers in Pathology and Oncology Archive",slug:"pathology-and-oncology-archive",abbreviation:O,space:{id:s,domainName:K,__typename:b},__typename:a},{id:610,name:bT,slug:bU,abbreviation:bV,space:{id:c,domainName:d,__typename:b},__typename:a},{id:3351,name:bT,slug:bU,abbreviation:bV,space:{id:c,domainName:d,__typename:b},__typename:a},{id:2705,name:"Frontiers in Parasitology",slug:"parasitology",abbreviation:"fpara",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1727,name:"Frontiers in Pain Research",slug:"pain-research",abbreviation:"fpain",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2679,name:"Frontiers in Organizational Psychology",slug:"organizational-psychology",abbreviation:"forgp",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1702,name:"Frontiers in Oral Health",slug:"oral-health",abbreviation:"froh",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2232,name:"Frontiers in Ophthalmology",slug:"ophthalmology",abbreviation:"fopht",space:{id:c,domainName:d,__typename:b},__typename:a},{id:451,name:"Frontiers in Oncology",slug:"oncology",abbreviation:"fonc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3123,name:"Frontiers in Ocean Sustainability",slug:"ocean-sustainability",abbreviation:"focsu",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2612,name:"Frontiers in Occupational Therapy",slug:"occupational-therapy",abbreviation:"froct",space:{id:c,domainName:d,__typename:b},__typename:a},{id:628,name:"Frontiers in Nutrition",slug:"nutrition",abbreviation:"fnut",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2062,name:"Frontiers in Nuclear 
Medicine",slug:"nuclear-medicine",abbreviation:"fnume",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2172,name:"Frontiers in Nuclear Engineering",slug:"nuclear-engineering",abbreviation:"fnuen",space:{id:c,domainName:d,__typename:b},__typename:a},{id:c,name:"Frontiers in Neuroscience",slug:"neuroscience",abbreviation:"fnins",space:{id:c,domainName:d,__typename:b},__typename:a},{id:bW,name:"Frontiers in Neurorobotics",slug:"neurorobotics",abbreviation:"fnbot",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3056,name:"Frontiers in Neuropsychiatry",slug:"neuropsychiatry",abbreviation:"fnpsy",space:{id:c,domainName:d,__typename:b},__typename:a},{id:141,name:"Frontiers in Neurology",slug:T,abbreviation:"fneur",space:{id:c,domainName:d,__typename:b},__typename:a},{id:bX,name:"Frontiers in Neuroinformatics",slug:"neuroinformatics",abbreviation:"fninf",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3283,name:"Frontiers in Neuroinflammation",slug:"neuroinflammation",abbreviation:"fnein",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1973,name:"Frontiers in Neuroimaging",slug:"neuroimaging",abbreviation:"fnimg",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1833,name:"Frontiers in Neuroergonomics",slug:"neuroergonomics",abbreviation:"fnrgo",space:{id:c,domainName:d,__typename:b},__typename:a},{id:H,name:"Frontiers in Neuroengineering",slug:"neuroengineering",abbreviation:"fneng",space:{id:c,domainName:d,__typename:b},__typename:a},{id:bY,name:"Frontiers in Neuroenergetics",slug:"neuroenergetics",abbreviation:"fnene",space:{id:c,domainName:d,__typename:b},__typename:a},{id:s,name:"Frontiers in Neuroanatomy",slug:"neuroanatomy",abbreviation:"fnana",space:{id:c,domainName:d,__typename:b},__typename:a},{id:G,name:"Frontiers in Neural Circuits",slug:"neural-circuits",abbreviation:"fncir",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2021,name:"Frontiers in Network Physiology",slug:"network-physiology",abbreviation:"fnetp",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3130,name:"Frontiers in Network Neuroscience",slug:"network-neuroscience",abbreviation:"fnnsc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2357,name:"Frontiers in Nephrology",slug:"nephrology",abbreviation:"fneph",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2320,name:"Frontiers in Natural Products",slug:"natural-products",abbreviation:"fntpr",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1528,name:"Frontiers in Nanotechnology",slug:"nanotechnology",abbreviation:"fnano",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2882,name:"Frontiers in Musculoskeletal Disorders",slug:"musculoskeletal-disorders",abbreviation:"fmscd",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3275,name:"Frontiers in Multiple Sclerosis",slug:"multiple-sclerosis",abbreviation:"fmscr",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3152,name:"Frontiers in Mollusk Science",slug:"mollusk-science",abbreviation:"fmlsc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2031,name:"Frontiers in Molecular Neuroscience",slug:"molecular-neuroscience",abbreviation:"fnmol",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2086,name:"Frontiers in Molecular Medicine",slug:"molecular-medicine",abbreviation:"fmmed",space:{id:c,domainName:d,__typename:b},__typename:a},{id:698,name:"Frontiers in Molecular 
Biosciences",slug:"molecular-biosciences",abbreviation:"fmolb",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2807,name:"Frontiers in Microbiomes",slug:"microbiomes",abbreviation:"frmbi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:310,name:"Frontiers in Microbiology",slug:"microbiology",abbreviation:"fmicb",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2327,name:"Frontiers in Metals and Alloys",slug:"metals-and-alloys",abbreviation:"ftmal",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2307,name:"Frontiers in Membrane Science and Technology",slug:"membrane-science-and-technology",abbreviation:"frmst",space:{id:c,domainName:d,__typename:b},__typename:a},{id:602,name:"Frontiers in Medicine",slug:"medicine",abbreviation:"fmed",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1573,name:"Frontiers in Medical Technology",slug:"medical-technology",abbreviation:"fmedt",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3135,name:"Frontiers in Medical Engineering",slug:"medical-engineering",abbreviation:"fmede",space:{id:c,domainName:d,__typename:b},__typename:a},{id:950,name:"Frontiers in Mechanical Engineering",slug:"mechanical-engineering",abbreviation:"fmech",space:{id:c,domainName:d,__typename:b},__typename:a},{id:608,name:"Frontiers in Materials",slug:"materials",abbreviation:"fmats",space:{id:c,domainName:d,__typename:b},__typename:a},{id:655,name:"Frontiers in Marine Science",slug:"marine-science",abbreviation:"fmars",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2100,name:"Frontiers in Manufacturing Technology",slug:"manufacturing-technology",abbreviation:"fmtec",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2931,name:"Frontiers in Mammal Science",slug:"mammal-science",abbreviation:"fmamm",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2896,name:"Frontiers in Malaria",slug:"malaria",abbreviation:"fmala",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3107,name:"Frontiers in Lupus",slug:"lupus",abbreviation:"flupu",space:{id:c,domainName:d,__typename:b},__typename:a},{id:435,name:"Frontiers in Linguistics",slug:"linguistics",abbreviation:"fling",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2636,name:"Frontiers in Language Sciences",slug:"language-sciences",abbreviation:"flang",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2670,name:"Frontiers in Lab on a Chip Technologies",slug:"lab-on-a-chip-technologies",abbreviation:"frlct",space:{id:c,domainName:d,__typename:b},__typename:a},{id:bZ,name:"Frontiers in Integrative Neuroscience",slug:"integrative-neuroscience",abbreviation:"fnint",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1723,name:"Frontiers in Insect Science",slug:"insect-science",abbreviation:"finsc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3093,name:"Frontiers in Influenza",slug:"influenza",abbreviation:"finfl",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3073,name:"Frontiers in Inflammation",slug:"inflammation",abbreviation:"finmn",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3200,name:"Frontiers in Industrial Microbiology",slug:"industrial-microbiology",abbreviation:"finmi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3291,name:"Frontiers in Industrial Engineering",slug:"industrial-engineering",abbreviation:"fieng",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2765,name:"Frontiers in Impact 
Journals",slug:"impact-journals",abbreviation:h,space:{id:c,domainName:d,__typename:b},__typename:a},{id:3078,name:"Frontiers in Immunotherapeutics",slug:"immunotherapeutics",abbreviation:"fimms",space:{id:c,domainName:d,__typename:b},__typename:a},{id:276,name:"Frontiers in Immunology",slug:"immunology",abbreviation:"fimmu",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2379,name:"Frontiers in Imaging",slug:"imaging",abbreviation:"fimag",space:{id:c,domainName:d,__typename:b},__typename:a},{id:629,name:"Frontiers in ICT",slug:"ict",abbreviation:"fict",space:{id:c,domainName:d,__typename:b},__typename:a},{id:16,name:"Frontiers in Humanities and Social Sciences Archive",slug:"humanities-and-social-sciences-archive",abbreviation:B,space:{id:i,domainName:j,__typename:b},__typename:a},{id:3759,name:"Frontiers in Human Rights",slug:"human-rights",abbreviation:h,space:{id:c,domainName:d,__typename:b},__typename:a},{id:1588,name:"Frontiers in Human Neuroscience",slug:"human-neuroscience",abbreviation:"fnhum",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1533,name:"Frontiers in Human Dynamics",slug:"human-dynamics",abbreviation:"fhumd",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2733,name:"Frontiers in Horticulture",slug:"horticulture",abbreviation:"fhort",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3316,name:"Frontiers in Histology",slug:"histology",abbreviation:"frhis",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2378,name:"Frontiers in High Performance Computing",slug:"high-performance-computing",abbreviation:"fhpcp",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2456,name:"Frontiers in Hematology",slug:"hematology",abbreviation:"frhem",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2063,name:"Frontiers in Health Services",slug:"health-services",abbreviation:"frhs",space:{id:c,domainName:d,__typename:b},__typename:a},{id:s,name:"Frontiers in Health Archive",slug:"health-archive",abbreviation:B,space:{id:i,domainName:j,__typename:b},__typename:a},{id:3508,name:"Frontiers in Green Chemistry",slug:"green-chemistry",abbreviation:"fgrch",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1728,name:"Frontiers in Global Women's Health",slug:"global-womens-health",abbreviation:"fgwh",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2918,name:"Frontiers in Geochemistry",slug:"geochemistry",abbreviation:"fgeoc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1540,name:"Frontiers in Genome Editing",slug:"genome-editing",abbreviation:"fgeed",space:{id:c,domainName:d,__typename:b},__typename:a},{id:240,name:"Frontiers in Genetics",slug:"genetics",abbreviation:"fgene",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3496,name:"Frontiers in Genetic Microbiology",slug:"genetic-microbiology",abbreviation:"fgemi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:aj,name:"Frontiers in Genetic Disorders",slug:"genetic-disorders",abbreviation:"frged",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2333,name:"Frontiers in Gastroenterology",slug:"gastroenterology",abbreviation:"fgstr",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1529,name:"Frontiers in Future Transportation",slug:"future-transportation",abbreviation:"ffutr",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1725,name:"Frontiers in Fungal Biology",slug:"fungal-biology",abbreviation:"ffunb",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2826,name:"Frontiers in 
Fuels",slug:"fuels",abbreviation:"ffuel",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3207,name:"Frontiers in Freshwater Science",slug:"freshwater-science",abbreviation:"ffwsc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1366,name:"Frontiers in Forests and Global Change",slug:"forests-and-global-change",abbreviation:"ffgc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2689,name:"Frontiers in Forensic Science",slug:"forensic-science",abbreviation:h,space:{id:c,domainName:d,__typename:b},__typename:a},{id:2289,name:"Frontiers in Food Science and Technology",slug:"food-science-and-technology",abbreviation:"frfst",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3559,name:"Frontiers in Fluorescence",slug:R,abbreviation:"fflur",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2987,name:"Frontiers in Fish Science",slug:"fish-science",abbreviation:"frish",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3489,name:"Frontiers in Fire Science and Technology",slug:"fire-science-and-technology",abbreviation:"firtc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2749,name:"Frontiers in Financial Economics",slug:"financial-economics",abbreviation:"ffecn",space:{id:c,domainName:d,__typename:b},__typename:a},{id:c,name:"Frontiers in FSHIP Test Journal",slug:"fship-test-journal",abbreviation:"ftest",space:{id:i,domainName:j,__typename:b},__typename:a},{id:bz,name:"Frontiers in Evolutionary Neuroscience",slug:"evolutionary-neuroscience",abbreviation:"fnevo",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2955,name:"Frontiers in Ethology",slug:"ethology",abbreviation:"fetho",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3032,name:"Frontiers in Epigenetics and Epigenomics",slug:"epigenetics-and-epigenomics",abbreviation:"freae",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2394,name:"Frontiers in Epidemiology",slug:"epidemiology",abbreviation:"fepid",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3450,name:"Frontiers in Environmental Toxicology",slug:"environmental-toxicology",abbreviation:"fentx",space:{id:c,domainName:d,__typename:b},__typename:a},{id:627,name:"Frontiers in Environmental Science",slug:"environmental-science",abbreviation:"fenvs",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2888,name:"Frontiers in Environmental Health",slug:"environmental-health",abbreviation:"fenvh",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2851,name:"Frontiers in Environmental Engineering",slug:"environmental-engineering",abbreviation:"fenve",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2547,name:"Frontiers in Environmental Economics",slug:"environmental-economics",abbreviation:"frevc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1697,name:"Frontiers in Environmental Chemistry",slug:"environmental-chemistry",abbreviation:"fenvc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2756,name:"Frontiers in Environmental Archaeology",slug:"environmental-archaeology",abbreviation:"fearc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:ak,name:"Frontiers in Engineering archive",slug:"engineering-archive",abbreviation:B,space:{id:i,domainName:j,__typename:b},__typename:a},{id:626,name:"Frontiers in Energy Research",slug:"energy-research",abbreviation:"fenrg",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3115,name:"Frontiers in Energy 
Efficiency",slug:"energy-efficiency",abbreviation:"fenef",space:{id:c,domainName:d,__typename:b},__typename:a},{id:106,name:"Frontiers in Endocrinology",slug:"endocrinology",abbreviation:"fendo",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1696,name:"Frontiers in Electronics",slug:"electronics",abbreviation:"felec",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1800,name:"Frontiers in Electronic Materials",slug:"electronic-materials",abbreviation:"femat",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2998,name:"Frontiers in Educational Psychology",slug:"educational-psychology",abbreviation:"fepys",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1239,name:"Frontiers in Education",slug:"education",abbreviation:"feduc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:625,name:"Frontiers in Economics",slug:"economics",abbreviation:"fecon",space:{id:c,domainName:d,__typename:b},__typename:a},{id:471,name:"Frontiers in Ecology and Evolution",slug:"ecology-and-evolution",abbreviation:"fevo",space:{id:c,domainName:d,__typename:b},__typename:a},{id:c,name:"Frontiers in Earth Science Archive",slug:"earth-science-archive",abbreviation:"gslfj",space:{id:t,domainName:M,__typename:b},__typename:a},{id:654,name:"Frontiers in Earth Science",slug:"earth-science",abbreviation:"feart",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3309,name:"Frontiers in Earth Observation and Land Monitoring",slug:"earth-observation-and-land-monitoring",abbreviation:"feolm",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2161,name:"Frontiers in Drug Safety and Regulation",slug:"drug-safety-and-regulation",abbreviation:"fdsfr",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2137,name:"Frontiers in Drug Discovery",slug:"drug-discovery",abbreviation:"fddsv",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2136,name:"Frontiers in Drug Delivery",slug:"drug-delivery",abbreviation:"fddev",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2775,name:"Frontiers in Disaster and Emergency Medicine",slug:"disaster-and-emergency-medicine",abbreviation:"femer",space:{id:c,domainName:d,__typename:b},__typename:a},{id:788,name:"Frontiers in Digital Humanities",slug:"digital-humanities",abbreviation:"fdigh",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1534,name:"Frontiers in Digital Health",slug:"digital-health",abbreviation:"fdgth",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2999,name:"Frontiers in Developmental Psychology",slug:"developmental-psychology",abbreviation:"fdpys",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2873,name:"Frontiers in Detector Science and Technology",slug:"detector-science-and-technology",abbreviation:"fdest",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3611,name:"Frontiers in Design Engineering",slug:"design-engineering",abbreviation:"fdese",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2550,name:"Frontiers in Dermatological Research",slug:"dermatological-research",abbreviation:"fdmre",space:{id:c,domainName:d,__typename:b},__typename:a},{id:607,name:"Frontiers in Dental Medicine",slug:"dental-medicine",abbreviation:"fdmed",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2597,name:"Frontiers in Dementia",slug:"dementia",abbreviation:"frdem",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1785,name:"Frontiers in Control 
Engineering",slug:"control-engineering",abbreviation:"fcteg",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1724,name:"Frontiers in Conservation Science",slug:"conservation-science",abbreviation:"fcosc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3454,name:"Frontiers in Condensed Matter",slug:"condensed-matter",abbreviation:"fconm",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1511,name:"Frontiers in Computer Science",slug:"computer-science",abbreviation:"fcomp",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3566,name:"Frontiers in Computational Physiology",slug:"computational-physiology",abbreviation:"fcphy",space:{id:c,domainName:d,__typename:b},__typename:a},{id:n,name:o,slug:r,abbreviation:V,space:{id:c,domainName:d,__typename:b},__typename:a},{id:3234,name:"Frontiers in Complex Systems",slug:"complex-systems",abbreviation:"fcpxs",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1787,name:"Frontiers in Communications and Networks",slug:"communications-and-networks",abbreviation:"frcmn",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1238,name:"Frontiers in Communication",slug:"communication",abbreviation:"fcomm",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2535,name:"Frontiers in Cognition",slug:"cognition",abbreviation:"fcogn",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2857,name:"Frontiers in Coatings, Dyes and Interface Engineering",slug:"coatings-dyes-and-interface-engineering",abbreviation:"frcdi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3222,name:"Frontiers in Clinical Microbiology",slug:"clinical-microbiology",abbreviation:"fclmi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1729,name:"Frontiers in Clinical Diabetes and Healthcare",slug:"clinical-diabetes-and-healthcare",abbreviation:"fcdhc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2551,name:"Frontiers in Clinical Dermatology",slug:"clinical-dermatology",abbreviation:"fcldm",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1490,name:"Frontiers in Climate",slug:"climate",abbreviation:"fclim",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3338,name:"Frontiers in Chromosome Research",slug:"chromosome-research",abbreviation:h,space:{id:c,domainName:d,__typename:b},__typename:a},{id:2587,name:"Frontiers in Child and Adolescent Psychiatry",slug:"child-and-adolescent-psychiatry",abbreviation:"frcha",space:{id:c,domainName:d,__typename:b},__typename:a},{id:601,name:"Frontiers in Chemistry",slug:"chemistry",abbreviation:"fchem",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1532,name:"Frontiers in Chemical Engineering",slug:"chemical-engineering",abbreviation:"fceng",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3038,name:"Frontiers in Chemical Biology",slug:"chemical-biology",abbreviation:"fchbi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3322,name:"Frontiers in Ceramics",slug:"ceramics",abbreviation:"fceic",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1440,name:"Frontiers in Cellular and Infection Microbiology",slug:"cellular-and-infection-microbiology",abbreviation:"fcimb",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1523,name:"Frontiers in Cellular Neuroscience",slug:"cellular-neuroscience",abbreviation:"fncel",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3084,name:"Frontiers in Cellular 
Immunology",slug:"cellular-immunology",abbreviation:"fcimy",space:{id:c,domainName:d,__typename:b},__typename:a},{id:403,name:"Frontiers in Cell and Developmental Biology",slug:"cell-and-developmental-biology",abbreviation:"fcell",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3178,name:"Frontiers in Cell Signaling",slug:"cell-signaling",abbreviation:"fcsig",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2655,name:"Frontiers in Cell Death",slug:"cell-death",abbreviation:"fceld",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1901,name:"Frontiers in Catalysis",slug:"catalysis",abbreviation:"fctls",space:{id:c,domainName:d,__typename:b},__typename:a},{id:755,name:"Frontiers in Cardiovascular Medicine",slug:"cardiovascular-medicine",abbreviation:"fcvm",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2662,name:"Frontiers in Carbon",slug:"carbon",abbreviation:"frcrb",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3513,name:"Frontiers in Cancer Interception",slug:"cancer-interception",abbreviation:"fcint",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3433,name:"Frontiers in Cancer Control and Society",slug:"cancer-control-and-society",abbreviation:"fcacs",space:{id:c,domainName:d,__typename:b},__typename:a},{id:921,name:"Frontiers in Built Environment",slug:"built-environment",abbreviation:"fbuil",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1418,name:"Frontiers in Blockchain",slug:"blockchain",abbreviation:"fbloc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2971,name:"Frontiers in Bird Science",slug:"bird-science",abbreviation:"fbirs",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3300,name:"Frontiers in Biophysics",slug:"biophysics",abbreviation:"frbis",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2222,name:"Frontiers in Biomaterials Science",slug:"biomaterials-science",abbreviation:"fbiom",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1722,name:"Frontiers in Bioinformatics",slug:"bioinformatics",abbreviation:"fbinf",space:{id:c,domainName:d,__typename:b},__typename:a},{id:452,name:"Frontiers in Bioengineering and Biotechnology",slug:"bioengineering-and-biotechnology",abbreviation:"fbioe",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1380,name:"Frontiers in Big Data",slug:"big-data",abbreviation:"fdata",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1589,name:"Frontiers in Behavioral Neuroscience",slug:"behavioral-neuroscience",abbreviation:"fnbeh",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2432,name:"Frontiers in Behavioral Economics",slug:"behavioral-economics",abbreviation:"frbhe",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2796,name:"Frontiers in Bee Science",slug:"bee-science",abbreviation:"frbee",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3214,name:"Frontiers in Batteries and Electrochemistry",slug:"batteries-and-electrochemistry",abbreviation:"fbael",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3011,name:"Frontiers in Bacteriology",slug:"bacteriology",abbreviation:"fbrio",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3040,name:"Frontiers in Audiology and Otology",slug:"audiology-and-otology",abbreviation:"fauot",space:{id:c,domainName:d,__typename:b},__typename:a},{id:603,name:"Frontiers in Astronomy and Space Sciences",slug:"astronomy-and-space-sciences",abbreviation:"fspas",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1437,name:"Frontiers in Artificial 
Intelligence",slug:"artificial-intelligence",abbreviation:"frai",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2940,name:"Frontiers in Arachnid Science",slug:"arachnid-science",abbreviation:"frchs",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2834,name:"Frontiers in Aquaculture",slug:"aquaculture",abbreviation:"faquc",space:{id:c,domainName:d,__typename:b},__typename:a},{id:981,name:"Frontiers in Applied Mathematics and Statistics",slug:"applied-mathematics-and-statistics",abbreviation:"fams",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3417,name:"Frontiers in Applied Environmental Microbiology",slug:"applied-environmental-microbiology",abbreviation:"faemi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2714,name:"Frontiers in Antibiotics",slug:"antibiotics",abbreviation:"frabi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3443,name:"Frontiers in Anti-Cancer Therapies",slug:"anti-cancer-therapies",abbreviation:"facth",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3253,name:"Frontiers in Antennas and Propagation",slug:"antennas-and-propagation",abbreviation:"fanpr",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1719,name:"Frontiers in Animal Science",slug:"animal-science",abbreviation:"fanim",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2513,name:"Frontiers in Anesthesiology",slug:"anesthesiology",abbreviation:"fanes",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1989,name:"Frontiers in Analytical Science",slug:"analytical-science",abbreviation:"frans",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2909,name:"Frontiers in Amphibian and Reptile Science",slug:"amphibian-and-reptile-science",abbreviation:"famrs",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1705,name:"Frontiers in Allergy",slug:"allergy",abbreviation:"falgy",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1541,name:"Frontiers in Agronomy",slug:"agronomy",abbreviation:"fagro",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3631,name:"Frontiers in Agricultural Engineering",slug:"agricultural-engineering",abbreviation:"faeng",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2477,name:"Frontiers in Aging Neuroscience",slug:"aging-neuroscience",abbreviation:"fnagi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:1566,name:"Frontiers in Aging",slug:"aging",abbreviation:"fragi",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2449,name:"Frontiers in Aerospace Engineering",slug:"aerospace-engineering",abbreviation:"fpace",space:{id:c,domainName:d,__typename:b},__typename:a},{id:2195,name:"Frontiers in Adolescent Medicine",slug:"adolescent-medicine",abbreviation:"fradm",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3426,name:"Frontiers in Acoustics",slug:"acoustics",abbreviation:"facou",space:{id:c,domainName:d,__typename:b},__typename:a},{id:979,name:bj,slug:"frontiers-for-young-minds",abbreviation:"frym",space:{id:c,domainName:d,__typename:b},__typename:a},{id:3260,name:"Frontiers In Ocean Engineering",slug:"frontiers-in-ocean-engineering",abbreviation:"focen",space:{id:c,domainName:d,__typename:b},__typename:a},{id:bW,name:"FSHIP Test Journal 
2",slug:"fship-test-journal-2",abbreviation:"FTJ2",space:{id:i,domainName:j,__typename:b},__typename:a},{id:i,name:b_,slug:b$,abbreviation:ca,space:{id:L,domainName:bs,__typename:b},__typename:a},{id:3746,name:b_,slug:b$,abbreviation:ca,space:{id:c,domainName:d,__typename:b},__typename:a},{id:bX,name:cb,slug:cc,abbreviation:cd,space:{id:i,domainName:j,__typename:b},__typename:a},{id:3231,name:cb,slug:cc,abbreviation:cd,space:{id:c,domainName:d,__typename:b},__typename:a},{id:t,name:ce,slug:cf,abbreviation:cg,space:{id:t,domainName:M,__typename:b},__typename:a},{id:2078,name:ce,slug:cf,abbreviation:cg,space:{id:c,domainName:d,__typename:b},__typename:a},{id:bZ,name:ch,slug:ci,abbreviation:cj,space:{id:i,domainName:j,__typename:b},__typename:a},{id:2359,name:ch,slug:ci,abbreviation:cj,space:{id:c,domainName:d,__typename:b},__typename:a},{id:8,name:ck,slug:cl,abbreviation:cm,space:{id:i,domainName:j,__typename:b},__typename:a},{id:2446,name:ck,slug:cl,abbreviation:cm,space:{id:c,domainName:d,__typename:b},__typename:a},{id:10,name:cn,slug:co,abbreviation:cp,space:{id:i,domainName:j,__typename:b},__typename:a},{id:3230,name:cn,slug:co,abbreviation:cp,space:{id:c,domainName:d,__typename:b},__typename:a},{id:t,name:cq,slug:cr,abbreviation:cs,space:{id:i,domainName:j,__typename:b},__typename:a},{id:2358,name:cq,slug:cr,abbreviation:cs,space:{id:c,domainName:d,__typename:b},__typename:a},{id:3660,name:"Advanced Optical Technologies",slug:"advanced-optical-technologies",abbreviation:"aot",space:{id:c,domainName:d,__typename:b},__typename:a},{id:bY,name:ct,slug:cu,abbreviation:cv,space:{id:i,domainName:j,__typename:b},__typename:a},{id:3659,name:ct,slug:cu,abbreviation:cv,space:{id:c,domainName:d,__typename:b},__typename:a},{id:H,name:cw,slug:cx,abbreviation:"abp",space:{id:i,domainName:j,__typename:b},__typename:a},{id:3695,name:cw,slug:cx,abbreviation:"ABP",space:{id:c,domainName:d,__typename:b},__typename:a}]},serverRendered:g,routePath:"\u002Fjournals\u002Fcomputational-neuroscience\u002Farticles\u002F10.3389\u002Ffncom.2021.650050\u002Ffull",config:{baseUrl:"https:\u002F\u002Fwww.frontiersin.org",appName:"article-pages-2022",spaceId:c,spaceName:v,domain:d,loopUrl:"https:\u002F\u002Floop.frontiersin.org",ssMainDomain:d,googleRecaptchaKeyName:"FrontiersRecaptchaV2",googleRecaptchaSiteKey:"6LdG3i0UAAAAAOC4qUh35ubHgJotEHp_STXHgr_v",linkedArticleCopyText:"'{\"articleTypeCopyText\":[{\"articleTypeId\":0,\"originalArticleCopyText\":\"Part of this article's content has been mentioned in:\",\"linkedArticleCopyText\":\"This article mentions parts of:\"},{\"articleTypeId\":122,\"originalArticleCopyText\":\"Parts of this article's content have been modified or rectified in:\",\"linkedArticleCopyText\":\"This article is an erratum on:\"},{\"articleTypeId\":129,\"originalArticleCopyText\":\"Parts of this article's content have been modified or rectified in:\",\"linkedArticleCopyText\":\"This article is an addendum to:\"},{\"articleTypeId\":128,\"originalArticleCopyText\":\"A correction has been applied to this article in:\",\"linkedArticleCopyText\":\"This article is a correction to:\"},{\"articleTypeId\":134,\"originalArticleCopyText\":\"A retraction of this article was approved in:\",\"linkedArticleCopyText\":\"This article is a retraction of:\"},{\"articleTypeId\":29,\"originalArticleCopyText\":\"A commentary has been posted on this article:\",\"linkedArticleCopyText\":\"This article is a commentary on:\"},{\"articleTypeId\":30,\"originalArticleCopyText\":\"A commentary has been posted on this 
article:\",\"linkedArticleCopyText\":\"This article is a commentary on:\"}],\"articleIdCopyText\":[]}'\n",articleTypeConfigurableLabel:"\u003C\u003Carticle-type:uppercase\u003E\u003E article",terminologySettings:"'{\"terms\":[{\"sequenceNumber\":1,\"key\":\"frontiers\",\"tenantTerm\":\"Frontiers\",\"frontiersDefaultTerm\":\"Frontiers\",\"category\":\"Customer\"},{\"sequenceNumber\":2,\"key\":\"submission_system\",\"tenantTerm\":\"submission system\",\"frontiersDefaultTerm\":\"submission system\",\"category\":\"Product\"},{\"sequenceNumber\":3,\"key\":\"public_pages\",\"tenantTerm\":\"public pages\",\"frontiersDefaultTerm\":\"public pages\",\"category\":\"Product\"},{\"sequenceNumber\":4,\"key\":\"my_frontiers\",\"tenantTerm\":\"my frontiers\",\"frontiersDefaultTerm\":\"my frontiers\",\"category\":\"Product\"},{\"sequenceNumber\":5,\"key\":\"digital_editorial_office\",\"tenantTerm\":\"digital editorial office\",\"frontiersDefaultTerm\":\"digital editorial office\",\"category\":\"Product\"},{\"sequenceNumber\":6,\"key\":\"deo\",\"tenantTerm\":\"DEO\",\"frontiersDefaultTerm\":\"DEO\",\"category\":\"Product\"},{\"sequenceNumber\":7,\"key\":\"digital_editorial_office_for_chiefs\",\"tenantTerm\":\"digital editorial office for chiefs\",\"frontiersDefaultTerm\":\"digital editorial office for chiefs\",\"category\":\"Product\"},{\"sequenceNumber\":8,\"key\":\"digital_editorial_office_for_eof\",\"tenantTerm\":\"digital editorial office for eof\",\"frontiersDefaultTerm\":\"digital editorial office for eof\",\"category\":\"Product\"},{\"sequenceNumber\":9,\"key\":\"editorial_office\",\"tenantTerm\":\"editorial office\",\"frontiersDefaultTerm\":\"editorial office\",\"category\":\"Product\"},{\"sequenceNumber\":10,\"key\":\"eof\",\"tenantTerm\":\"EOF\",\"frontiersDefaultTerm\":\"EOF\",\"category\":\"Product\"},{\"sequenceNumber\":11,\"key\":\"research_topic_management\",\"tenantTerm\":\"research topic management\",\"frontiersDefaultTerm\":\"research topic management\",\"category\":\"Product\"},{\"sequenceNumber\":12,\"key\":\"review_forum\",\"tenantTerm\":\"review forum\",\"frontiersDefaultTerm\":\"review forum\",\"category\":\"Product\"},{\"sequenceNumber\":13,\"key\":\"accounting_office\",\"tenantTerm\":\"accounting office\",\"frontiersDefaultTerm\":\"accounting office\",\"category\":\"Product\"},{\"sequenceNumber\":14,\"key\":\"aof\",\"tenantTerm\":\"AOF\",\"frontiersDefaultTerm\":\"AOF\",\"category\":\"Product\"},{\"sequenceNumber\":15,\"key\":\"publishing_office\",\"tenantTerm\":\"publishing office\",\"frontiersDefaultTerm\":\"publishing office\",\"category\":\"Product\"},{\"sequenceNumber\":16,\"key\":\"production_office\",\"tenantTerm\":\"production office forum\",\"frontiersDefaultTerm\":\"production office forum\",\"category\":\"Product\"},{\"sequenceNumber\":17,\"key\":\"pof\",\"tenantTerm\":\"POF\",\"frontiersDefaultTerm\":\"POF\",\"category\":\"Product\"},{\"sequenceNumber\":18,\"key\":\"book_office_forum\",\"tenantTerm\":\"book office forum\",\"frontiersDefaultTerm\":\"book office forum\",\"category\":\"Product\"},{\"sequenceNumber\":19,\"key\":\"bof\",\"tenantTerm\":\"BOF\",\"frontiersDefaultTerm\":\"BOF\",\"category\":\"Product\"},{\"sequenceNumber\":20,\"key\":\"aira\",\"tenantTerm\":\"AIRA\",\"frontiersDefaultTerm\":\"AIRA\",\"category\":\"Product\"},{\"sequenceNumber\":21,\"key\":\"editorial_board_management\",\"tenantTerm\":\"editorial board management\",\"frontiersDefaultTerm\":\"editorial board 
management\",\"category\":\"Product\"},{\"sequenceNumber\":22,\"key\":\"ebm\",\"tenantTerm\":\"EBM\",\"frontiersDefaultTerm\":\"EBM\",\"category\":\"Product\"},{\"sequenceNumber\":23,\"key\":\"domain\",\"tenantTerm\":\"domain\",\"frontiersDefaultTerm\":\"domain\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":24,\"key\":\"journal\",\"tenantTerm\":\"journal\",\"frontiersDefaultTerm\":\"journal\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":25,\"key\":\"section\",\"tenantTerm\":\"section\",\"frontiersDefaultTerm\":\"section\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":26,\"key\":\"domains\",\"tenantTerm\":\"domains\",\"frontiersDefaultTerm\":\"domains\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":27,\"key\":\"specialty_section\",\"tenantTerm\":\"specialty section\",\"frontiersDefaultTerm\":\"specialty section\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":28,\"key\":\"specialty_journal\",\"tenantTerm\":\"specialty journal\",\"frontiersDefaultTerm\":\"specialty journal\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":29,\"key\":\"journals\",\"tenantTerm\":\"journals\",\"frontiersDefaultTerm\":\"journals\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":30,\"key\":\"sections\",\"tenantTerm\":\"sections\",\"frontiersDefaultTerm\":\"sections\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":31,\"key\":\"specialty_sections\",\"tenantTerm\":\"specialty sections\",\"frontiersDefaultTerm\":\"specialty sections\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":32,\"key\":\"specialty_journals\",\"tenantTerm\":\"specialty journals\",\"frontiersDefaultTerm\":\"specialty journals\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":33,\"key\":\"manuscript\",\"tenantTerm\":\"manuscript\",\"frontiersDefaultTerm\":\"manuscript\",\"category\":\"Core\"},{\"sequenceNumber\":34,\"key\":\"manuscripts\",\"tenantTerm\":\"manuscripts\",\"frontiersDefaultTerm\":\"manuscripts\",\"category\":\"Core\"},{\"sequenceNumber\":35,\"key\":\"article\",\"tenantTerm\":\"article\",\"frontiersDefaultTerm\":\"article\",\"category\":\"Core\"},{\"sequenceNumber\":36,\"key\":\"articles\",\"tenantTerm\":\"articles\",\"frontiersDefaultTerm\":\"articles\",\"category\":\"Core\"},{\"sequenceNumber\":37,\"key\":\"article_type\",\"tenantTerm\":\"article type\",\"frontiersDefaultTerm\":\"article type\",\"category\":\"Core\"},{\"sequenceNumber\":38,\"key\":\"article_types\",\"tenantTerm\":\"article types\",\"frontiersDefaultTerm\":\"article types\",\"category\":\"Core\"},{\"sequenceNumber\":39,\"key\":\"author\",\"tenantTerm\":\"author\",\"frontiersDefaultTerm\":\"author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":40,\"key\":\"authors\",\"tenantTerm\":\"authors\",\"frontiersDefaultTerm\":\"authors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":41,\"key\":\"authoring\",\"tenantTerm\":\"authoring\",\"frontiersDefaultTerm\":\"authoring\",\"category\":\"Core\"},{\"sequenceNumber\":42,\"key\":\"authored\",\"tenantTerm\":\"authored\",\"frontiersDefaultTerm\":\"authored\",\"category\":\"Core\"},{\"sequenceNumber\":43,\"key\":\"accept\",\"tenantTerm\":\"accept\",\"frontiersDefaultTerm\":\"accept\",\"category\":\"Process\"},{\"sequenceNumber\":44,\"key\":\"accepted\",\"tenantTerm\":\"accepted\",\"frontiersDefaultTerm\":\"accepted\",\"category\":\"Process\"},{\"sequenceNumber\":45,\"key\":\"assistant_field_chief_editor\",\"tenantTerm\":\"Assistant Field Chief Editor\",\"frontiersDefaultTerm\":\"Assistant Field Chief Editor\",\"description\":\"An editorial role on a Field Journal that a Registered User may hold. 
This gives them rights to different functionality and parts of the platform\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":46,\"key\":\"assistant_specialty_chief_editor\",\"tenantTerm\":\"Assistant Specialty Chief Editor\",\"frontiersDefaultTerm\":\"Assistant Specialty Chief Editor\",\"description\":\"An editorial role on a specialty that a Registered User may hold. This gives them rights to different functionality and parts of the platform\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":47,\"key\":\"assistant_specialty_chief_editors\",\"tenantTerm\":\"Assistant Specialty Chief Editors\",\"frontiersDefaultTerm\":\"Assistant Specialty Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":48,\"key\":\"associate_editor\",\"tenantTerm\":\"Associate Editor\",\"frontiersDefaultTerm\":\"Associate Editor\",\"description\":\"An editorial role on a specialty that a Registered User may hold. This gives them rights to different functionality and parts of the platform\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":49,\"key\":\"specialty_chief_editor\",\"tenantTerm\":\"Specialty Chief Editor\",\"frontiersDefaultTerm\":\"Specialty Chief Editor\",\"description\":\"An editorial role on a specialty that a Registered User may hold. This gives them rights to different functionality and parts of the platform\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":50,\"key\":\"specialty_chief_editors\",\"tenantTerm\":\"Specialty Chief Editors\",\"frontiersDefaultTerm\":\"Specialty Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":51,\"key\":\"chief_editor\",\"tenantTerm\":\"Chief Editor\",\"frontiersDefaultTerm\":\"Chief Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":52,\"key\":\"chief_editors\",\"tenantTerm\":\"Chief Editors\",\"frontiersDefaultTerm\":\"Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":53,\"key\":\"call_for_participation\",\"tenantTerm\":\"call for participation\",\"frontiersDefaultTerm\":\"call for participation\",\"category\":\"Process\"},{\"sequenceNumber\":54,\"key\":\"citation\",\"tenantTerm\":\"citation\",\"frontiersDefaultTerm\":\"citation\",\"category\":\"Misc.\"},{\"sequenceNumber\":55,\"key\":\"citations\",\"tenantTerm\":\"citations\",\"frontiersDefaultTerm\":\"citations\",\"category\":\"Misc.\"},{\"sequenceNumber\":56,\"key\":\"contributor\",\"tenantTerm\":\"contributor\",\"frontiersDefaultTerm\":\"contributor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":57,\"key\":\"contributors\",\"tenantTerm\":\"contributors\",\"frontiersDefaultTerm\":\"contributors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":58,\"key\":\"corresponding_author\",\"tenantTerm\":\"corresponding author\",\"frontiersDefaultTerm\":\"corresponding author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":59,\"key\":\"corresponding_authors\",\"tenantTerm\":\"corresponding authors\",\"frontiersDefaultTerm\":\"corresponding authors\",\"category\":\"Label 
(Role)\"},{\"sequenceNumber\":60,\"key\":\"decline\",\"tenantTerm\":\"decline\",\"frontiersDefaultTerm\":\"decline\",\"category\":\"Process\"},{\"sequenceNumber\":61,\"key\":\"declined\",\"tenantTerm\":\"declined\",\"frontiersDefaultTerm\":\"declined\",\"category\":\"Process\"},{\"sequenceNumber\":62,\"key\":\"reject\",\"tenantTerm\":\"reject\",\"frontiersDefaultTerm\":\"reject\",\"category\":\"Process\"},{\"sequenceNumber\":63,\"key\":\"rejected\",\"tenantTerm\":\"rejected\",\"frontiersDefaultTerm\":\"rejected\",\"category\":\"Process\"},{\"sequenceNumber\":64,\"key\":\"publish\",\"tenantTerm\":\"publish\",\"frontiersDefaultTerm\":\"publish\",\"category\":\"Core\"},{\"sequenceNumber\":65,\"key\":\"published\",\"tenantTerm\":\"published\",\"frontiersDefaultTerm\":\"published\",\"category\":\"Core\"},{\"sequenceNumber\":66,\"key\":\"publication\",\"tenantTerm\":\"publication\",\"frontiersDefaultTerm\":\"publication\",\"category\":\"Core\"},{\"sequenceNumber\":67,\"key\":\"peer_review\",\"tenantTerm\":\"peer review\",\"frontiersDefaultTerm\":\"peer review\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":68,\"key\":\"peer_reviewed\",\"tenantTerm\":\"peer reviewed\",\"frontiersDefaultTerm\":\"peer reviewed\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":69,\"key\":\"initial_validation\",\"tenantTerm\":\"initial validation\",\"frontiersDefaultTerm\":\"initial validation\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":70,\"key\":\"editorial_assignment\",\"tenantTerm\":\"editorial assignment\",\"frontiersDefaultTerm\":\"editorial assignment\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":71,\"key\":\"independent_review\",\"tenantTerm\":\"independent review\",\"frontiersDefaultTerm\":\"independent review\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":72,\"key\":\"interactive_review\",\"tenantTerm\":\"interactive review\",\"frontiersDefaultTerm\":\"interactive review\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":73,\"key\":\"review\",\"tenantTerm\":\"review\",\"frontiersDefaultTerm\":\"review\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":74,\"key\":\"reviewing\",\"tenantTerm\":\"reviewing\",\"frontiersDefaultTerm\":\"reviewing\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":75,\"key\":\"reviewer\",\"tenantTerm\":\"reviewer\",\"frontiersDefaultTerm\":\"reviewer\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":76,\"key\":\"reviewers\",\"tenantTerm\":\"reviewers\",\"frontiersDefaultTerm\":\"reviewers\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":77,\"key\":\"review_finalized\",\"tenantTerm\":\"review finalized\",\"frontiersDefaultTerm\":\"review finalized\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":78,\"key\":\"final_decision\",\"tenantTerm\":\"final decision\",\"frontiersDefaultTerm\":\"final decision\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":79,\"key\":\"final_validation\",\"tenantTerm\":\"final validation\",\"frontiersDefaultTerm\":\"final validation\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":80,\"key\":\"ae_accept_manuscript\",\"tenantTerm\":\"recommend to accept manuscript\",\"frontiersDefaultTerm\":\"accept 
manuscript\",\"category\":\"Process\"},{\"sequenceNumber\":81,\"key\":\"fee\",\"tenantTerm\":\"fee\",\"frontiersDefaultTerm\":\"fee\",\"category\":\"Accounting\"},{\"sequenceNumber\":82,\"key\":\"fees\",\"tenantTerm\":\"fees\",\"frontiersDefaultTerm\":\"fees\",\"category\":\"Accounting\"},{\"sequenceNumber\":83,\"key\":\"guest_associate_editor\",\"tenantTerm\":\"Guest Associate Editor\",\"frontiersDefaultTerm\":\"Guest Associate Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":84,\"key\":\"guest_associate_editors\",\"tenantTerm\":\"Guest Associate Editors\",\"frontiersDefaultTerm\":\"Guest Associate Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":85,\"key\":\"in_review\",\"tenantTerm\":\"in review\",\"frontiersDefaultTerm\":\"in review\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":86,\"key\":\"institutional_member\",\"tenantTerm\":\"institutional partner\",\"frontiersDefaultTerm\":\"institutional partner\",\"category\":\"Accounting\"},{\"sequenceNumber\":87,\"key\":\"institutional_membership\",\"tenantTerm\":\"institutional partnership\",\"frontiersDefaultTerm\":\"institutional partnership\",\"category\":\"Accounting\"},{\"sequenceNumber\":88,\"key\":\"article_processing_charge\",\"tenantTerm\":\"article processing charge\",\"frontiersDefaultTerm\":\"article processing charge\",\"category\":\"Accounting\"},{\"sequenceNumber\":89,\"key\":\"article_processing_charges\",\"tenantTerm\":\"article processing charges\",\"frontiersDefaultTerm\":\"article processing charges\",\"category\":\"Accounting\"},{\"sequenceNumber\":90,\"key\":\"apcs\",\"tenantTerm\":\"APCs\",\"frontiersDefaultTerm\":\"APCs\",\"category\":\"Accounting\"},{\"sequenceNumber\":91,\"key\":\"apc\",\"tenantTerm\":\"APC\",\"frontiersDefaultTerm\":\"APC\",\"category\":\"Accounting\"},{\"sequenceNumber\":92,\"key\":\"received\",\"tenantTerm\":\"received\",\"frontiersDefaultTerm\":\"received\",\"description\":\"Date manuscript was received on.\",\"category\":\"Core\"},{\"sequenceNumber\":93,\"key\":\"transferred\",\"tenantTerm\":\"transferred\",\"frontiersDefaultTerm\":\"transferred\",\"category\":\"Core\"},{\"sequenceNumber\":94,\"key\":\"transfer\",\"tenantTerm\":\"transfer\",\"frontiersDefaultTerm\":\"transfer\",\"category\":\"Core\"},{\"sequenceNumber\":95,\"key\":\"research_topic\",\"tenantTerm\":\"research topic\",\"frontiersDefaultTerm\":\"research topic\",\"category\":\"Core\"},{\"sequenceNumber\":96,\"key\":\"research_topics\",\"tenantTerm\":\"research topics\",\"frontiersDefaultTerm\":\"research topics\",\"category\":\"Core\"},{\"sequenceNumber\":97,\"key\":\"topic_editor\",\"tenantTerm\":\"Topic Editor\",\"frontiersDefaultTerm\":\"Topic Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":98,\"key\":\"review_editor\",\"tenantTerm\":\"Review Editor\",\"frontiersDefaultTerm\":\"Review Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":99,\"key\":\"title\",\"tenantTerm\":\"title\",\"frontiersDefaultTerm\":\"title\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":100,\"key\":\"running_title\",\"tenantTerm\":\"running title\",\"frontiersDefaultTerm\":\"running title\",\"category\":\"Manuscript 
Metadata\"},{\"sequenceNumber\":101,\"key\":\"submit\",\"tenantTerm\":\"submit\",\"frontiersDefaultTerm\":\"submit\",\"category\":\"Process\"},{\"sequenceNumber\":102,\"key\":\"submitted\",\"tenantTerm\":\"submitted\",\"frontiersDefaultTerm\":\"submitted\",\"category\":\"Process\"},{\"sequenceNumber\":103,\"key\":\"submitting\",\"tenantTerm\":\"submitting\",\"frontiersDefaultTerm\":\"submitting\",\"category\":\"Process\"},{\"sequenceNumber\":104,\"key\":\"t_e\",\"tenantTerm\":\"TE\",\"frontiersDefaultTerm\":\"TE\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":105,\"key\":\"topic\",\"tenantTerm\":\"topic\",\"frontiersDefaultTerm\":\"topic\",\"category\":\"Process\"},{\"sequenceNumber\":106,\"key\":\"topic_summary\",\"tenantTerm\":\"topic summary\",\"frontiersDefaultTerm\":\"topic summary\",\"category\":\"Process\"},{\"sequenceNumber\":107,\"key\":\"figure\",\"tenantTerm\":\"figure\",\"frontiersDefaultTerm\":\"figure\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":108,\"key\":\"figures\",\"tenantTerm\":\"figures\",\"frontiersDefaultTerm\":\"figures\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":109,\"key\":\"editorial_file\",\"tenantTerm\":\"editorial file\",\"frontiersDefaultTerm\":\"editorial file\",\"category\":\"Core\"},{\"sequenceNumber\":110,\"key\":\"editorial_files\",\"tenantTerm\":\"editorial files\",\"frontiersDefaultTerm\":\"editorial files\",\"category\":\"Core\"},{\"sequenceNumber\":111,\"key\":\"e_book\",\"tenantTerm\":\"e-book\",\"frontiersDefaultTerm\":\"e-book\",\"category\":\"Core\"},{\"sequenceNumber\":112,\"key\":\"organization\",\"tenantTerm\":\"organization\",\"frontiersDefaultTerm\":\"organization\",\"category\":\"Core\"},{\"sequenceNumber\":113,\"key\":\"institution\",\"tenantTerm\":\"institution\",\"frontiersDefaultTerm\":\"institution\",\"category\":\"Core\"},{\"sequenceNumber\":114,\"key\":\"reference\",\"tenantTerm\":\"reference\",\"frontiersDefaultTerm\":\"reference\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":115,\"key\":\"references\",\"tenantTerm\":\"references\",\"frontiersDefaultTerm\":\"references\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":116,\"key\":\"sce\",\"tenantTerm\":\"SCE\",\"frontiersDefaultTerm\":\"SCE\",\"description\":\"Abbreviation for Specialty Chief Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":117,\"key\":\"submission\",\"tenantTerm\":\"submission\",\"frontiersDefaultTerm\":\"submission\",\"category\":\"Process\"},{\"sequenceNumber\":118,\"key\":\"submissions\",\"tenantTerm\":\"submissions\",\"frontiersDefaultTerm\":\"submissions\",\"category\":\"Process\"},{\"sequenceNumber\":119,\"key\":\"editing\",\"tenantTerm\":\"editing\",\"frontiersDefaultTerm\":\"editing\",\"category\":\"Process\"},{\"sequenceNumber\":120,\"key\":\"in_preparation\",\"tenantTerm\":\"in preparation\",\"frontiersDefaultTerm\":\"in preparation\",\"category\":\"Process\"},{\"sequenceNumber\":121,\"key\":\"country_region\",\"tenantTerm\":\"country\u002Fregion\",\"frontiersDefaultTerm\":\"country\u002Fregion\",\"description\":\"Because of political issues, some of the country listings are actually classified as `regions` and we need to include this. 
However other clients may not want to do this.\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":122,\"key\":\"countries_regions\",\"tenantTerm\":\"countries\u002Fregions\",\"frontiersDefaultTerm\":\"countries\u002Fregions\",\"description\":\"Because of political issues, some of the country listings are actually classified as `regions` and we need to include this. However other clients may not want to do this.\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":123,\"key\":\"specialty\",\"tenantTerm\":\"specialty\",\"frontiersDefaultTerm\":\"specialty\",\"category\":\"Core\"},{\"sequenceNumber\":124,\"key\":\"specialties\",\"tenantTerm\":\"specialties\",\"frontiersDefaultTerm\":\"specialties\",\"category\":\"Core\"},{\"sequenceNumber\":125,\"key\":\"associate_editors\",\"tenantTerm\":\"Associate Editors\",\"frontiersDefaultTerm\":\"Associate Editors\",\"description\":\"An editorial role on a specialty that a Registered User may hold. This gives them rights to different functionality and parts of the platform\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":126,\"key\":\"reviewed\",\"tenantTerm\":\"reviewed\",\"frontiersDefaultTerm\":\"reviewed\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":127,\"key\":\"institutional_members\",\"tenantTerm\":\"institutional partners\",\"frontiersDefaultTerm\":\"institutional partners\",\"category\":\"Accounting\"},{\"sequenceNumber\":128,\"key\":\"institutional_memberships\",\"tenantTerm\":\"institutional partnerships\",\"frontiersDefaultTerm\":\"institutional partnerships\",\"category\":\"Accounting\"},{\"sequenceNumber\":129,\"key\":\"assistant_field_chief_editors\",\"tenantTerm\":\"Assistant Field Chief Editors\",\"frontiersDefaultTerm\":\"Assistant Field Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":130,\"key\":\"publications\",\"tenantTerm\":\"publications\",\"frontiersDefaultTerm\":\"publications\",\"category\":\"Process\"},{\"sequenceNumber\":131,\"key\":\"ae_accepted\",\"tenantTerm\":\"recommended acceptance\",\"frontiersDefaultTerm\":\"accepted\",\"category\":\"Process\"},{\"sequenceNumber\":132,\"key\":\"field_journal\",\"tenantTerm\":\"field journal\",\"frontiersDefaultTerm\":\"field journal\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":133,\"key\":\"field_journals\",\"tenantTerm\":\"field journals\",\"frontiersDefaultTerm\":\"field journals\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":134,\"key\":\"program_manager\",\"tenantTerm\":\"program manager\",\"frontiersDefaultTerm\":\"program manager\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":135,\"key\":\"journal_manager\",\"tenantTerm\":\"journal manager\",\"frontiersDefaultTerm\":\"journal manager\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":136,\"key\":\"journal_specialist\",\"tenantTerm\":\"journal specialist\",\"frontiersDefaultTerm\":\"journal specialist\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":137,\"key\":\"program_managers\",\"tenantTerm\":\"program managers\",\"frontiersDefaultTerm\":\"program managers\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":138,\"key\":\"journal_managers\",\"tenantTerm\":\"journal managers\",\"frontiersDefaultTerm\":\"journal managers\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":139,\"key\":\"journal_specialists\",\"tenantTerm\":\"journal specialists\",\"frontiersDefaultTerm\":\"journal specialists\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":140,\"key\":\"cover_letter\",\"tenantTerm\":\"manuscript contribution to the 
field\",\"frontiersDefaultTerm\":\"manuscript contribution to the field\",\"category\":\"Process\"},{\"sequenceNumber\":141,\"key\":\"ae_accepted_manuscript\",\"tenantTerm\":\"recommended to accept manuscript\",\"frontiersDefaultTerm\":\"accepted manuscript\",\"category\":\"Process\"},{\"sequenceNumber\":142,\"key\":\"recommend_for_rejection\",\"tenantTerm\":\"recommend for rejection\",\"frontiersDefaultTerm\":\"recommend for rejection\",\"category\":\"Process\"},{\"sequenceNumber\":143,\"key\":\"recommended_for_rejection\",\"tenantTerm\":\"recommended for rejection\",\"frontiersDefaultTerm\":\"recommended for rejection\",\"category\":\"Process\"},{\"sequenceNumber\":144,\"key\":\"ae\",\"tenantTerm\":\"AE\",\"frontiersDefaultTerm\":\"AE\",\"description\":\"Associate Editor - board member\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":145,\"key\":\"re\",\"tenantTerm\":\"RE\",\"frontiersDefaultTerm\":\"RE\",\"description\":\"Review Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":146,\"key\":\"rev\",\"tenantTerm\":\"REV\",\"frontiersDefaultTerm\":\"REV\",\"description\":\"Reviewer\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":147,\"key\":\"aut\",\"tenantTerm\":\"AUT\",\"frontiersDefaultTerm\":\"AUT\",\"description\":\"Author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":148,\"key\":\"coraut\",\"tenantTerm\":\"CORAUT\",\"frontiersDefaultTerm\":\"CORAUT\",\"description\":\"Corresponding author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":149,\"key\":\"saut\",\"tenantTerm\":\"SAUT\",\"frontiersDefaultTerm\":\"SAUT\",\"description\":\"Submitting author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":150,\"key\":\"coaut\",\"tenantTerm\":\"COAUT\",\"frontiersDefaultTerm\":\"COAUT\",\"description\":\"co-author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":151,\"key\":\"tsof\",\"tenantTerm\":\"TSOF\",\"frontiersDefaultTerm\":\"TSOF\",\"description\":\"Typesetter\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":152,\"key\":\"typesetting_office\",\"tenantTerm\":\"typesetting office\",\"frontiersDefaultTerm\":\"typesetting office\",\"category\":\"Product\"},{\"sequenceNumber\":153,\"key\":\"config\",\"tenantTerm\":\"CONFIG\",\"frontiersDefaultTerm\":\"CONFIG\",\"description\":\"Configuration office role\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":154,\"key\":\"jm\",\"tenantTerm\":\"JM\",\"frontiersDefaultTerm\":\"JM\",\"description\":\"Journal Manager\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":155,\"key\":\"rte\",\"tenantTerm\":\"RTE\",\"frontiersDefaultTerm\":\"RTE\",\"description\":\"Research topic editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":156,\"key\":\"organizations\",\"tenantTerm\":\"organizations\",\"frontiersDefaultTerm\":\"organizations\",\"category\":\"Core\"},{\"sequenceNumber\":157,\"key\":\"publishing\",\"tenantTerm\":\"publishing\",\"frontiersDefaultTerm\":\"publishing\",\"category\":\"Core\"},{\"sequenceNumber\":158,\"key\":\"acceptance\",\"tenantTerm\":\"acceptance\",\"frontiersDefaultTerm\":\"acceptance\",\"category\":\"Process\"},{\"sequenceNumber\":159,\"key\":\"preferred_associate_editor\",\"tenantTerm\":\"preferred associate editor\",\"frontiersDefaultTerm\":\"preferred associate editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":160,\"key\":\"topic_editors\",\"tenantTerm\":\"Topic Editors\",\"frontiersDefaultTerm\":\"Topic Editors\",\"category\":\"Label 
(Role)\"},{\"sequenceNumber\":161,\"key\":\"institutions\",\"tenantTerm\":\"institutions\",\"frontiersDefaultTerm\":\"institutions\",\"category\":\"Core\"},{\"sequenceNumber\":162,\"key\":\"author(s)\",\"tenantTerm\":\"author(s)\",\"frontiersDefaultTerm\":\"author(s)\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":163,\"key\":\"figure(s)\",\"tenantTerm\":\"figure(s)\",\"frontiersDefaultTerm\":\"figure(s)\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":164,\"key\":\"co-authors\",\"tenantTerm\":\"co-authors\",\"frontiersDefaultTerm\":\"co-authors\",\"description\":\"co-authors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":165,\"key\":\"editorial_board_members\",\"tenantTerm\":\"editorial board members\",\"frontiersDefaultTerm\":\"editorial board members\",\"description\":\"editorial board members\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":166,\"key\":\"editorial_board\",\"tenantTerm\":\"editorial board\",\"frontiersDefaultTerm\":\"editorial board\",\"description\":\"editorial board\",\"category\":\"Product\"},{\"sequenceNumber\":167,\"key\":\"co-authorship\",\"tenantTerm\":\"co-authorship\",\"frontiersDefaultTerm\":\"co-authorship\",\"description\":\"co-authorship\",\"category\":\"Misc.\"},{\"sequenceNumber\":168,\"key\":\"role_id_1\",\"tenantTerm\":\"registration office\",\"frontiersDefaultTerm\":\"registration office\",\"category\":\"User Role\"},{\"sequenceNumber\":169,\"key\":\"role_id_2\",\"tenantTerm\":\"editorial office\",\"frontiersDefaultTerm\":\"editorial office\",\"category\":\"User Role\"},{\"sequenceNumber\":170,\"key\":\"role_id_7\",\"tenantTerm\":\"field chief editor\",\"frontiersDefaultTerm\":\"field chief editor\",\"category\":\"User Role\"},{\"sequenceNumber\":171,\"key\":\"role_id_8\",\"tenantTerm\":\"assistant field chief editor\",\"frontiersDefaultTerm\":\"assistant field chief editor\",\"category\":\"User Role\"},{\"sequenceNumber\":172,\"key\":\"role_id_9\",\"tenantTerm\":\"specialty chief editor\",\"frontiersDefaultTerm\":\"specialty chief editor\",\"category\":\"User Role\"},{\"sequenceNumber\":173,\"key\":\"role_id_10\",\"tenantTerm\":\"assistant specialty chief editor\",\"frontiersDefaultTerm\":\"assistant specialty chief editor\",\"category\":\"User Role\"},{\"sequenceNumber\":174,\"key\":\"role_id_11\",\"tenantTerm\":\"associate editor\",\"frontiersDefaultTerm\":\"associate editor\",\"category\":\"User Role\"},{\"sequenceNumber\":175,\"key\":\"role_id_12\",\"tenantTerm\":\"guest associate editor\",\"frontiersDefaultTerm\":\"guest associate editor\",\"category\":\"User Role\"},{\"sequenceNumber\":176,\"key\":\"role_id_13\",\"tenantTerm\":\"review editor\",\"frontiersDefaultTerm\":\"review editor\",\"category\":\"User Role\"},{\"sequenceNumber\":177,\"key\":\"role_id_14\",\"tenantTerm\":\"reviewer\",\"frontiersDefaultTerm\":\"reviewer\",\"category\":\"User Role\"},{\"sequenceNumber\":178,\"key\":\"role_id_15\",\"tenantTerm\":\"author\",\"frontiersDefaultTerm\":\"author\",\"category\":\"User Role\"},{\"sequenceNumber\":179,\"key\":\"role_id_16\",\"tenantTerm\":\"corresponding author\",\"frontiersDefaultTerm\":\"corresponding author\",\"category\":\"User Role\"},{\"sequenceNumber\":180,\"key\":\"role_id_17\",\"tenantTerm\":\"submitting author\",\"frontiersDefaultTerm\":\"submitting author\",\"category\":\"User Role\"},{\"sequenceNumber\":181,\"key\":\"role_id_18\",\"tenantTerm\":\"co-author\",\"frontiersDefaultTerm\":\"co-author\",\"category\":\"User Role\"},{\"sequenceNumber\":182,\"key\":\"role_id_20\",\"tenantTerm\":\"production 
office\",\"frontiersDefaultTerm\":\"production office\",\"category\":\"User Role\"},{\"sequenceNumber\":183,\"key\":\"role_id_22\",\"tenantTerm\":\"typesetting office (typesetter)\",\"frontiersDefaultTerm\":\"typesetting office (typesetter)\",\"category\":\"User Role\"},{\"sequenceNumber\":184,\"key\":\"role_id_24\",\"tenantTerm\":\"registered user\",\"frontiersDefaultTerm\":\"registered user\",\"category\":\"User Role\"},{\"sequenceNumber\":185,\"key\":\"role_id_35\",\"tenantTerm\":\"job office\",\"frontiersDefaultTerm\":\"job office\",\"category\":\"User Role\"},{\"sequenceNumber\":186,\"key\":\"role_id_41\",\"tenantTerm\":\"special event administrator\",\"frontiersDefaultTerm\":\"special event administrator\",\"category\":\"User Role\"},{\"sequenceNumber\":187,\"key\":\"role_id_42\",\"tenantTerm\":\"special event reviewer\",\"frontiersDefaultTerm\":\"special event reviewer\",\"category\":\"User Role\"},{\"sequenceNumber\":188,\"key\":\"role_id_43\",\"tenantTerm\":\"submit abstract\",\"frontiersDefaultTerm\":\"submit abstract\",\"category\":\"User Role\"},{\"sequenceNumber\":189,\"key\":\"role_id_52\",\"tenantTerm\":\"events office\",\"frontiersDefaultTerm\":\"events office\",\"category\":\"User Role\"},{\"sequenceNumber\":190,\"key\":\"role_id_53\",\"tenantTerm\":\"event administrator\",\"frontiersDefaultTerm\":\"event administrator\",\"category\":\"User Role\"},{\"sequenceNumber\":191,\"key\":\"role_id_89\",\"tenantTerm\":\"content management office\",\"frontiersDefaultTerm\":\"content management office\",\"category\":\"User Role\"},{\"sequenceNumber\":192,\"key\":\"role_id_98\",\"tenantTerm\":\"accounting office\",\"frontiersDefaultTerm\":\"accounting office\",\"category\":\"User Role\"},{\"sequenceNumber\":193,\"key\":\"role_id_99\",\"tenantTerm\":\"projects\",\"frontiersDefaultTerm\":\"projects\",\"category\":\"User Role\"},{\"sequenceNumber\":194,\"key\":\"role_id_103\",\"tenantTerm\":\"configuration office\",\"frontiersDefaultTerm\":\"configuration office\",\"category\":\"User Role\"},{\"sequenceNumber\":195,\"key\":\"role_id_104\",\"tenantTerm\":\"beta user\",\"frontiersDefaultTerm\":\"beta user\",\"category\":\"User Role\"},{\"sequenceNumber\":196,\"key\":\"role_id_106\",\"tenantTerm\":\"wfconf\",\"frontiersDefaultTerm\":\"wfconf\",\"category\":\"User Role\"},{\"sequenceNumber\":197,\"key\":\"role_id_107\",\"tenantTerm\":\"rt management beta user\",\"frontiersDefaultTerm\":\"rt management beta user\",\"category\":\"User Role\"},{\"sequenceNumber\":198,\"key\":\"role_id_108\",\"tenantTerm\":\"deo beta user\",\"frontiersDefaultTerm\":\"deo beta user\",\"category\":\"User Role\"},{\"sequenceNumber\":199,\"key\":\"role_id_109\",\"tenantTerm\":\"search beta user\",\"frontiersDefaultTerm\":\"search beta user\",\"category\":\"User Role\"},{\"sequenceNumber\":200,\"key\":\"role_id_110\",\"tenantTerm\":\"journal manager\",\"frontiersDefaultTerm\":\"journal manager\",\"category\":\"User Role\"},{\"sequenceNumber\":201,\"key\":\"role_id_111\",\"tenantTerm\":\"myfrontiers beta user\",\"frontiersDefaultTerm\":\"myfrontiers beta user\",\"category\":\"User Role\"},{\"sequenceNumber\":202,\"key\":\"role_id_21\",\"tenantTerm\":\"copy editor\",\"frontiersDefaultTerm\":\"copy editor\",\"category\":\"User Role\"},{\"sequenceNumber\":203,\"key\":\"role_id_1_abr\",\"tenantTerm\":\"ROF\",\"frontiersDefaultTerm\":\"ROF\",\"category\":\"User Role\"},{\"sequenceNumber\":204,\"key\":\"role_id_2_abr\",\"tenantTerm\":\"EOF\",\"frontiersDefaultTerm\":\"EOF\",\"category\":\"User 
Role\"},{\"sequenceNumber\":205,\"key\":\"role_id_7_abr\",\"tenantTerm\":\"FCE\",\"frontiersDefaultTerm\":\"FCE\",\"category\":\"User Role\"},{\"sequenceNumber\":206,\"key\":\"role_id_8_abr\",\"tenantTerm\":\"AFCE\",\"frontiersDefaultTerm\":\"AFCE\",\"category\":\"User Role\"},{\"sequenceNumber\":207,\"key\":\"role_id_9_abr\",\"tenantTerm\":\"SCE\",\"frontiersDefaultTerm\":\"SCE\",\"category\":\"User Role\"},{\"sequenceNumber\":208,\"key\":\"role_id_10_abr\",\"tenantTerm\":\"ASCE\",\"frontiersDefaultTerm\":\"ASCE\",\"category\":\"User Role\"},{\"sequenceNumber\":209,\"key\":\"role_id_11_abr\",\"tenantTerm\":\"AE\",\"frontiersDefaultTerm\":\"AE\",\"category\":\"User Role\"},{\"sequenceNumber\":210,\"key\":\"role_id_12_abr\",\"tenantTerm\":\"GAE\",\"frontiersDefaultTerm\":\"GAE\",\"category\":\"User Role\"},{\"sequenceNumber\":211,\"key\":\"role_id_13_abr\",\"tenantTerm\":\"RE\",\"frontiersDefaultTerm\":\"RE\",\"category\":\"User Role\"},{\"sequenceNumber\":212,\"key\":\"role_id_14_abr\",\"tenantTerm\":\"REV\",\"frontiersDefaultTerm\":\"REV\",\"category\":\"User Role\"},{\"sequenceNumber\":213,\"key\":\"role_id_15_abr\",\"tenantTerm\":\"AUT\",\"frontiersDefaultTerm\":\"AUT\",\"category\":\"User Role\"},{\"sequenceNumber\":214,\"key\":\"role_id_16_abr\",\"tenantTerm\":\"CORAUT\",\"frontiersDefaultTerm\":\"CORAUT\",\"category\":\"User Role\"},{\"sequenceNumber\":215,\"key\":\"role_id_17_abr\",\"tenantTerm\":\"SAUT\",\"frontiersDefaultTerm\":\"SAUT\",\"category\":\"User Role\"},{\"sequenceNumber\":216,\"key\":\"role_id_18_abr\",\"tenantTerm\":\"COAUT\",\"frontiersDefaultTerm\":\"COAUT\",\"category\":\"User Role\"},{\"sequenceNumber\":217,\"key\":\"role_id_20_abr\",\"tenantTerm\":\"POF\",\"frontiersDefaultTerm\":\"POF\",\"category\":\"User Role\"},{\"sequenceNumber\":218,\"key\":\"role_id_22_abr\",\"tenantTerm\":\"TSOF\",\"frontiersDefaultTerm\":\"TSOF\",\"category\":\"User Role\"},{\"sequenceNumber\":219,\"key\":\"role_id_24_abr\",\"tenantTerm\":\"RU\",\"frontiersDefaultTerm\":\"RU\",\"category\":\"User Role\"},{\"sequenceNumber\":220,\"key\":\"role_id_35_abr\",\"tenantTerm\":\"JOF\",\"frontiersDefaultTerm\":\"JOF\",\"category\":\"User Role\"},{\"sequenceNumber\":221,\"key\":\"role_id_41_abr\",\"tenantTerm\":\"SE-ADM\",\"frontiersDefaultTerm\":\"SE-ADM\",\"category\":\"User Role\"},{\"sequenceNumber\":222,\"key\":\"role_id_42_abr\",\"tenantTerm\":\"SE-REV\",\"frontiersDefaultTerm\":\"SE-REV\",\"category\":\"User Role\"},{\"sequenceNumber\":223,\"key\":\"role_id_43_abr\",\"tenantTerm\":\"SE-AUT\",\"frontiersDefaultTerm\":\"SE-AUT\",\"category\":\"User Role\"},{\"sequenceNumber\":224,\"key\":\"role_id_52_abr\",\"tenantTerm\":\"EVOF\",\"frontiersDefaultTerm\":\"EVOF\",\"category\":\"User Role\"},{\"sequenceNumber\":225,\"key\":\"role_id_53_abr\",\"tenantTerm\":\"EV-ADM\",\"frontiersDefaultTerm\":\"EV-ADM\",\"category\":\"User Role\"},{\"sequenceNumber\":226,\"key\":\"role_id_89_abr\",\"tenantTerm\":\"COMOF\",\"frontiersDefaultTerm\":\"COMOF\",\"category\":\"User Role\"},{\"sequenceNumber\":227,\"key\":\"role_id_98_abr\",\"tenantTerm\":\"AOF\",\"frontiersDefaultTerm\":\"AOF\",\"category\":\"User Role\"},{\"sequenceNumber\":228,\"key\":\"role_id_99_abr\",\"tenantTerm\":\"Projects\",\"frontiersDefaultTerm\":\"Projects\",\"category\":\"User Role\"},{\"sequenceNumber\":229,\"key\":\"role_id_103_abr\",\"tenantTerm\":\"CONFIG\",\"frontiersDefaultTerm\":\"CONFIG\",\"category\":\"User 
Role\"},{\"sequenceNumber\":230,\"key\":\"role_id_104_abr\",\"tenantTerm\":\"BETA\",\"frontiersDefaultTerm\":\"BETA\",\"category\":\"User Role\"},{\"sequenceNumber\":231,\"key\":\"role_id_106_abr\",\"tenantTerm\":\"WFCONF\",\"frontiersDefaultTerm\":\"WFCONF\",\"category\":\"User Role\"},{\"sequenceNumber\":232,\"key\":\"role_id_107_abr\",\"tenantTerm\":\"RTBETA\",\"frontiersDefaultTerm\":\"RTBETA\",\"category\":\"User Role\"},{\"sequenceNumber\":233,\"key\":\"role_id_108_abr\",\"tenantTerm\":\"DEOBETA\",\"frontiersDefaultTerm\":\"DEOBETA\",\"category\":\"User Role\"},{\"sequenceNumber\":234,\"key\":\"role_id_109_abr\",\"tenantTerm\":\"SEARCHBETA\",\"frontiersDefaultTerm\":\"SEARCHBETA\",\"category\":\"User Role\"},{\"sequenceNumber\":235,\"key\":\"role_id_110_abr\",\"tenantTerm\":\"JM\",\"frontiersDefaultTerm\":\"JM\",\"category\":\"User Role\"},{\"sequenceNumber\":236,\"key\":\"role_id_111_abr\",\"tenantTerm\":\"MFBETA\",\"frontiersDefaultTerm\":\"MFBETA\",\"category\":\"User Role\"},{\"sequenceNumber\":237,\"key\":\"role_id_21_abr\",\"tenantTerm\":\"COPED\",\"frontiersDefaultTerm\":\"COPED\",\"category\":\"User Role\"},{\"sequenceNumber\":238,\"key\":\"reviewer_editorial_board\",\"tenantTerm\":\"editorial board\",\"frontiersDefaultTerm\":\"editorial board\",\"description\":\"This is the label for the review editorial board\",\"category\":\"Label\"},{\"sequenceNumber\":239,\"key\":\"field_chief_editor\",\"tenantTerm\":\"Field Chief Editor\",\"frontiersDefaultTerm\":\"Field Chief Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":240,\"key\":\"field_chief_editors\",\"tenantTerm\":\"Field Chief Editors\",\"frontiersDefaultTerm\":\"Field Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":241,\"key\":\"editor\",\"tenantTerm\":\"editor\",\"frontiersDefaultTerm\":\"editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":242,\"key\":\"editors\",\"tenantTerm\":\"editors\",\"frontiersDefaultTerm\":\"editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":243,\"key\":\"board\",\"tenantTerm\":\"board\",\"frontiersDefaultTerm\":\"board\",\"category\":\"Label\"},{\"sequenceNumber\":244,\"key\":\"boards\",\"tenantTerm\":\"boards\",\"frontiersDefaultTerm\":\"boards\",\"category\":\"Label\"},{\"sequenceNumber\":245,\"key\":\"article_collection\",\"tenantTerm\":\"article collection\",\"frontiersDefaultTerm\":\"article collection\",\"category\":\"Label\"},{\"sequenceNumber\":246,\"key\":\"article_collections\",\"tenantTerm\":\"article collections\",\"frontiersDefaultTerm\":\"article collections\",\"category\":\"Label\"},{\"sequenceNumber\":247,\"key\":\"handling_editor\",\"tenantTerm\":\"handling editor\",\"frontiersDefaultTerm\":\"associate editor\",\"description\":\"This terminology key is for the person assigned to edit a manuscript. It is a label for the temporary handling editor assignment.\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":248,\"key\":\"handling_editors\",\"tenantTerm\":\"handling editors\",\"frontiersDefaultTerm\":\"associate editors\",\"description\":\"This terminology key is for the person assigned to edit a manuscript. 
It is a label for the temporary handling editor assignment.\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":249,\"key\":\"ae_accept\",\"tenantTerm\":\"recommend acceptance\",\"frontiersDefaultTerm\":\"accept\",\"category\":\"Process\"},{\"sequenceNumber\":250,\"key\":\"rtm\",\"tenantTerm\":\"RTM\",\"frontiersDefaultTerm\":\"RTM\",\"category\":\"Product\"},{\"sequenceNumber\":251,\"key\":\"frontiers_media_sa\",\"tenantTerm\":\"Frontiers Media S.A\",\"frontiersDefaultTerm\":\"Frontiers Media S.A\",\"category\":\"Customer\"},{\"sequenceNumber\":252,\"key\":\"review_editors\",\"tenantTerm\":\"Review Editors\",\"frontiersDefaultTerm\":\"Review Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":253,\"key\":\"journal_card_chief_editor\",\"tenantTerm\":\"Chief Editor\",\"frontiersDefaultTerm\":\"Chief Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":254,\"key\":\"journal_card_chief_editors\",\"tenantTerm\":\"Chief Editors\",\"frontiersDefaultTerm\":\"Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":255,\"key\":\"call_for_papers\",\"tenantTerm\":\"Call for papers\",\"frontiersDefaultTerm\":\"Call for papers\",\"category\":\"Label\"},{\"sequenceNumber\":256,\"key\":\"calls_for_papers\",\"tenantTerm\":\"Calls for papers\",\"frontiersDefaultTerm\":\"Calls for papers\",\"category\":\"Label\"},{\"sequenceNumber\":257,\"key\":\"supervising_editor\",\"tenantTerm\":\"Supervising Editor\",\"frontiersDefaultTerm\":\"Supervising Editor\",\"description\":\"A Chief or Assistant Chief editor who is assigned to a manuscript to supervise.\",\"category\":\"Role\",\"externalKey\":\"supervising_editor\"},{\"sequenceNumber\":258,\"key\":\"supervising_editors\",\"tenantTerm\":\"Supervising Editors\",\"frontiersDefaultTerm\":\"Supervising Editors\",\"description\":\"A Chief or Assistant Chief editor who is assigned to a manuscript to supervise.\",\"category\":\"Role\",\"externalKey\":\"supervising_editors\"},{\"sequenceNumber\":259,\"key\":\"reviewer_endorse\",\"tenantTerm\":\"endorse\",\"frontiersDefaultTerm\":\"endorse\",\"category\":\"Label\"},{\"sequenceNumber\":260,\"key\":\"reviewer_endorsed\",\"tenantTerm\":\"endorsed\",\"frontiersDefaultTerm\":\"endorsed\",\"category\":\"Label\"},{\"sequenceNumber\":261,\"key\":\"reviewer_endorse_publication\",\"tenantTerm\":\"endorse publication\",\"frontiersDefaultTerm\":\"endorse publication\",\"category\":\"Label\"},{\"sequenceNumber\":262,\"key\":\"reviewer_endorsed_publication\",\"tenantTerm\":\"endorsed publication\",\"frontiersDefaultTerm\":\"endorsed publication\",\"category\":\"Label\"},{\"sequenceNumber\":263,\"key\":\"editor_role\",\"tenantTerm\":\"editor role\",\"frontiersDefaultTerm\":\"Editor Role\",\"category\":\"Label\"},{\"sequenceNumber\":264,\"key\":\"editor_roles\",\"tenantTerm\":\"editor roles\",\"frontiersDefaultTerm\":\"Editor Roles\",\"category\":\"Label\"},{\"sequenceNumber\":265,\"key\":\"editorial_role\",\"tenantTerm\":\"editorial role\",\"frontiersDefaultTerm\":\"Editorial Role\",\"category\":\"Label\"},{\"sequenceNumber\":266,\"key\":\"editorial_roles\",\"tenantTerm\":\"editorial roles\",\"frontiersDefaultTerm\":\"Editorial Roles\",\"category\":\"Label\"},{\"sequenceNumber\":267,\"key\":\"call_for_paper\",\"tenantTerm\":\"Call for paper\",\"frontiersDefaultTerm\":\"Call for paper\",\"category\":\"Label\"},{\"sequenceNumber\":268,\"key\":\"research_topic_abstract\",\"tenantTerm\":\"manuscript summary\",\"frontiersDefaultTerm\":\"manuscript 
summary\",\"category\":\"Process\"},{\"sequenceNumber\":269,\"key\":\"research_topic_abstracts\",\"tenantTerm\":\"manuscript summaries\",\"frontiersDefaultTerm\":\"manuscript summaries\",\"category\":\"Process\"},{\"sequenceNumber\":270,\"key\":\"submissions_team_manager\",\"tenantTerm\":\"Content Manager\",\"frontiersDefaultTerm\":\"Content Manager\",\"category\":\"Process\"},{\"sequenceNumber\":271,\"key\":\"submissions_team\",\"tenantTerm\":\"Content Team\",\"frontiersDefaultTerm\":\"Content Team\",\"category\":\"Process\"},{\"sequenceNumber\":272,\"key\":\"topic_coordinator\",\"tenantTerm\":\"topic coordinator\",\"frontiersDefaultTerm\":\"topic coordinator\",\"category\":\"Process\"},{\"sequenceNumber\":273,\"key\":\"topic_coordinators\",\"tenantTerm\":\"topic coordinators\",\"frontiersDefaultTerm\":\"topic coordinators\",\"category\":\"Process\"}]}'\n",gtmId:"GTM-M322FV2",gtmAuth:"owVbWxfaJr21yQv1fe1cAQ",gtmServerUrl:"https:\u002F\u002Ftag-manager.frontiersin.org",gtmPreview:"env-1",faviconSize512:"https:\u002F\u002Fbrand.frontiersin.org\u002Fm\u002Fed3f9ce840a03d7\u002Ffavicon_512-tenantFavicon-Frontiers.png",socialMediaImg:"https:\u002F\u002Fbrand.frontiersin.org\u002Fm\u002F1c8bcb536c789e11\u002FGuidelines-Frontiers_Logo_1200x628_1-91to1.png",_app:{basePath:"\u002F",assetsPath:"\u002Farticle-pages\u002F_nuxt\u002F",cdnURL:e}},apollo:{contentfulJournalsDelivery:Object.create(null),contentfulJournalsPreview:Object.create(null),contentfulHomeDelivery:Object.create(null),contentfulHomePreview:Object.create(null),frontiersGraph:Object.create(null)}}}("journal_journal","public_space",1,"frontiersin.org",null,"_self",true,"",3,"frontierspartnerships.org","_blank",false,0,9,"Frontiers in Computational Neuroscience","PDF",5,"computational-neuroscience",4,2,"description","Frontiers","Help center","Link","Grey","Medium","ssph-journal.org","fship","image","Front. Comput. Neurosci.","1662-5188",void 0,11,18,"United States",1920,"por-journal.com",7,"escubed.org",1918,"fipp","https:\u002F\u002Fd2csxpduxe849s.cloudfront.net\u002Fmedia\u002FE32629C6-9347-4F84-81FEAEF7BFA342B3\u002F56C3A6B3-C9F9-41FF-A2AD73FFDD914D8D\u002Fwebimage-8DD7BA64-E266-4BF0-9CC5ABA89F00ED7D.png","science","fluorescence","red","neurology","2022-06-27T10:00:19Z","fncom",237,55,"10.3389\u002Ffncom.2021.650050","A Deep Learning Approach for Automatic Seizure Detection in Children With Epilepsy","\u003Cp\u003EOver the last few decades, electroencephalogram (EEG) has become one of the most vital tools used by physicians to diagnose several neurological disorders of the human brain and, in particular, to detect seizures. Because of its peculiar nature, the consequent impact of epileptic seizures on the quality of life of patients made the precise diagnosis of epilepsy extremely essential. Therefore, this article proposes a novel deep-learning approach for detecting seizures in pediatric patients based on the classification of raw multichannel EEG signal recordings that are minimally pre-processed. The new approach takes advantage of the automatic feature learning capabilities of a two-dimensional deep convolution autoencoder (2D-DCAE) linked to a neural network-based classifier to form a unified system that is trained in a supervised way to achieve the best classification accuracy between the ictal and interictal brain state signals. For testing and evaluating our approach, two models were designed and assessed using three different EEG data segment lengths and a 10-fold cross-validation scheme. 
Based on five evaluation metrics, the best performing model was a supervised deep convolutional autoencoder (SDCAE) model that uses a bidirectional long short-term memory (Bi-LSTM) – based classifier, and EEG segment length of 4 s. Using the public dataset collected from the Children’s Hospital Boston (CHB) and the Massachusetts Institute of Technology (MIT), this model has obtained 98.79 ± 0.53% accuracy, 98.72 ± 0.77% sensitivity, 98.86 ± 0.53% specificity, 98.86 ± 0.53% precision, and an F1-score of 98.79 ± 0.53%, respectively. Based on these results, our new approach was able to present one of the most effective seizure detection methods compared to other existing state-of-the-art methods applied to the same dataset.\u003C\u002Fp\u003E",1163050,"Ahmed","Department of Electrical and Computer Engineering, University of Louisiana at Lafayette","Magdy",1003479,"Saman",222607,"Antonio",1191759,"Nhan Duy",3227,15,"EPUB","fncom-15-650050.pdf","Frontiers | A Deep Learning Approach for Automatic Seizure Detection in Children With Epilepsy","https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fcomputational-neuroscience\u002Farticles\u002F10.3389\u002Ffncom.2021.650050\u002Ffull","Over the last few decades, electroencephalogram (EEG) has become one of the most vital tools used by physicians to diagnose several neurological disorders of...","og:title","og:description","keywords","og:site_name","og:image","og:type","og:url","twitter:card","citation_volume","citation_journal_title","citation_publisher","citation_journal_abbrev","citation_issn","citation_doi","citation_firstpage","citation_language","citation_title","citation_keywords","citation_abstract","citation_pdf_url","citation_online_date","citation_publication_date","citation_author","citation_author_institution","Department of Electrical and Computer Engineering, University of Louisiana at Lafayette, United States","dc.identifier","articles","editors","research-topics","How we publish","https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fhow-we-publish","Fee policy","https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Ffee-policy","Research Topics","https:\u002F\u002Fforum.frontiersin.org\u002F","Frontiers Planet Prize","https:\u002F\u002Fwww.frontiersplanetprize.org\u002F","this link will take you to the Frontiers Planet Prize website","Career opportunities","https:\u002F\u002Fcareers.frontiersin.org\u002F","https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fcontact","Author guidelines","Editor guidelines","https:\u002F\u002Fwww.frontiersin.org\u002Fjournals","https:\u002F\u002Fwww.frontiersin.org\u002Farticles","Articles","https:\u002F\u002Fhelpcenter.frontiersin.org","Frontiers for Young Minds","Frontiers Facebook","Transplant International","transplant-international","ti",1921,"Spanish Journal of Soil Science","spanish-journal-of-soil-science","sjss","ebm-journal.org","Public Health Reviews","public-health-reviews","phrs","Pathology and Oncology Research","pathology-and-oncology-research","pore",21,"Pastoralism: Research, Policy and Practice","pastoralism-research-policy-and-practice","past","Oncology Reviews","oncology-reviews","or","Journal of Pharmacy & Pharmaceutical Sciences","journal-of-pharmacy-pharmaceutical-sciences","jpps","Journal of Cutaneous Immunology and Allergy","journal-of-cutaneous-immunology-and-allergy","JCIA","Journal of Abdominal Wall Surgery","journal-of-abdominal-wall-surgery","jaws",1919,"International Journal of Public Health","international-journal-of-public-health","ijph","Frontiers in 
Pathology","pathology","fpath",13,12,17,6,"Experimental Biology and Medicine","experimental-biology-and-medicine","EBM","European Journal of Cultural Management and Policy","european-journal-of-cultural-management-and-policy","ejcmp","Earth Science, Systems and Society","earth-science-systems-and-society","esss","Dystonia","dystonia","dyst","British Journal of Biomedical Science","british-journal-of-biomedical-science","bjbs","Aerospace Research Communications","aerospace-research-communications","arc","Advances in Drug and Alcohol Research","advances-in-drug-and-alcohol-research","adar","Acta Virologica","acta-virologica","av","Acta Biochimica Polonica","acta-biochimica-polonica"));</script><script src="/article-pages/_nuxt/4764e3b.js" defer></script><script src="/article-pages/_nuxt/a07a553.js" defer></script><script src="/article-pages/_nuxt/94ee25c.js" defer></script><script src="/article-pages/_nuxt/5465e0e.js" defer></script><script src="/article-pages/_nuxt/fb04c78.js" defer></script><script src="/article-pages/_nuxt/f8f682e.js" defer></script><script src="/article-pages/_nuxt/8e7ee66.js" defer></script><script src="/article-pages/_nuxt/232bf4b.js" defer></script><script src="/article-pages/_nuxt/3b10072.js" defer></script><script data-n-head="ssr" src="https://cdnjs.cloudflare.com/polyfill/v3/polyfill.min.js?features=es6" data-body="true" async></script><script data-n-head="ssr" src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-MML-AM_CHTML" data-body="true" async></script><script data-n-head="ssr" src="https://d1bxh8uas1mnw7.cloudfront.net/assets/altmetric_badges-f0bc9b243ff5677d05460c1eb71834ca998946d764eb3bc244ab4b18ba50d21e.js" data-body="true" async></script><script data-n-head="ssr" src="https://api.altmetric.com/v1/doi/10.3389/fncom.2021.650050?callback=_altmetric.embed_callback&amp;domain=www.frontiersin.org&amp;key=3c130976ca2b8f2e88f8377633751ba1&amp;cache_until=14-15" data-body="true" async></script><script data-n-head="ssr" src="https://widgets.figshare.com/static/figshare.js" data-body="true" async></script><script data-n-head="ssr" src="https://crossmark-cdn.crossref.org/widget/v2.0/widget.js" data-body="true" async></script> </body> </html>

Pages: 1 2 3 4 5 6 7 8 9 10