Perception and Reality: Why a Wholly Empirical Paradigm is Needed to Understand Vision

Dale Purves (Duke Institute for Brain Sciences, Duke University, Durham, NC, USA), Yaniv Morgenstern (Duke-NUS Graduate Medical School, Singapore) and William T. Wojtach (Duke Institute for Brain Sciences, Duke University, Durham, NC, USA)

Frontiers in Systems Neuroscience, Vol. 9 (2015). DOI: 10.3389/fnsys.2015.00156. ISSN: 1662-5137. Published: 18 November 2015 (online 29 October 2015).
Available at: https://www.frontiersin.org/journals/systems-neuroscience/articles/10.3389/fnsys.2015.00156/full

Keywords: vision; visual perception; feature detection; Bayesian probability; efficient coding; empirical ranking

Abstract

A central puzzle in vision science is how perceptions that are routinely at odds with physical measurements of real world properties can arise from neural responses that nonetheless lead to effective behaviors. Here we argue that the solution depends on: (1) rejecting the assumption that the goal of vision is to recover, however imperfectly, properties of the world; and (2) replacing it with a paradigm in which perceptions reflect biological utility based on past experience rather than objective features of the environment. Present evidence is consistent with the conclusion that conceiving vision in wholly empirical terms provides a plausible way to understand what we see and why.
a.K7.sessionTrace:return[a.K7.ajax,a.K7.pageViewEvent];case a.K7.sessionReplay:return[a.K7.sessionTrace];case a.K7.pageViewTiming:return[a.K7.pageViewEvent];default:return[]}}(r.featureName).filter((e=>!(e in this.features)));n.length>0&&(0,e.R)(36,{targetFeature:r.featureName,missingDependencies:n}),this.features[r.featureName]=new r(this)}))}catch(t){(0,e.R)(22,t);for(const e in this.features)this.features[e].abortHandler?.();const r=(0,R.Zm)();delete r.initializedAgents[this.agentIdentifier]?.api,delete r.initializedAgents[this.agentIdentifier]?.features,delete this.sharedAggregator;return r.ee.get(this.agentIdentifier).abort(),!1}}}({features:[he,w,S,Se,Oe,O,M,ot,ut,Pe,nt],loaderType:"spa"})})()})();</script><link rel="preload" href="/article-pages/_nuxt/e397d1a.js" as="script"><link rel="preload" href="/article-pages/_nuxt/2abb6c5.js" as="script"><link rel="preload" href="/article-pages/_nuxt/css/66101cf.css" as="style"><link rel="preload" href="/article-pages/_nuxt/701e3a3.js" as="script"><link rel="preload" href="/article-pages/_nuxt/css/dac93f2.css" as="style"><link rel="preload" href="/article-pages/_nuxt/71728a1.js" as="script"><link rel="preload" href="/article-pages/_nuxt/a5e7651.js" as="script"><link rel="preload" href="/article-pages/_nuxt/css/e5cdfa1.css" as="style"><link rel="preload" href="/article-pages/_nuxt/f548f7f.js" as="script"><link rel="preload" href="/article-pages/_nuxt/css/868b092.css" as="style"><link rel="preload" href="/article-pages/_nuxt/e3c5a8f.js" as="script"><link rel="preload" href="/article-pages/_nuxt/css/97e04e3.css" as="style"><link rel="preload" href="/article-pages/_nuxt/0d6d8e5.js" as="script"><link rel="preload" href="/article-pages/_nuxt/ed7fc59.js" as="script"><link rel="stylesheet" href="/article-pages/_nuxt/css/66101cf.css"><link rel="stylesheet" href="/article-pages/_nuxt/css/dac93f2.css"><link rel="stylesheet" href="/article-pages/_nuxt/css/e5cdfa1.css"><link rel="stylesheet" href="/article-pages/_nuxt/css/868b092.css"><link rel="stylesheet" href="/article-pages/_nuxt/css/97e04e3.css"> <meta property="fb:admins" content="1841006843"> </head> <body > <button class="BypassBlock__firstEl"></button> <a href="#main-content" class="BypassBlock__wrapper"> <span class="BypassBlock__button">Skip to main content</span> </a> <!-- Google Tag Manager (noscript) --> <noscript> <iframe src="https://tag-manager.frontiersin.org/ns.html?id=GTM-M322FV2>m_auth=owVbWxfaJr21yQv1fe1cAQ>m_preview=env-1>m_cookies_win=x" height="0" width="0" style="display:none;visibility:hidden"></iframe> </noscript> <!-- End Google Tag Manager (noscript) --> <div data-server-rendered="true" id="__nuxt"><div id="__layout"><div theme="purple" class="ArticleLayout"><nav class="Ibar"><div class="Ibar__main"><div class="Ibar__wrapper"><button aria-label="Open Menu" data-event="iBar-btn-openMenu" class="Ibar__burger"></button> <div class="Ibar__logo"><a href="//www.frontiersin.org/" aria-label="Frontiershome" data-event="iBar-a-home" class="Ibar__logo__link"><svg viewBox="0 0 2811 590" fill="none" xmlns="http://www.w3.org/2000/svg" class="Ibar__logo__svg"><path d="M633.872 234.191h-42.674v-57.246h42.674c0-19.776 2.082-35.389 5.204-48.92 4.164-13.53 9.368-23.939 17.695-31.225 8.326-8.326 18.735-13.53 32.266-16.653 13.531-3.123 29.143-5.204 47.878-5.204h21.858c7.286 0 14.572 1.04 21.857 1.04v62.451c-8.326-1.041-16.653-2.082-23.939-2.082-10.408 0-17.694 1.041-23.939 4.164-6.245 3.122-9.368 10.408-9.368 22.898v13.531h53.083v57.246h-53.083v213.372h-89.512V234.191zM794.161 
176.945h86.39v47.879h1.041c6.245-17.694 16.653-30.185 31.225-39.552 14.572-9.368 31.225-13.531 49.96-13.531h10.409c3.122 0 7.286 1.041 10.408 2.082v81.185c-6.245-2.082-11.449-3.122-16.653-4.163-5.204-1.041-11.449-1.041-16.654-1.041-11.449 0-20.816 2.082-29.143 5.204-8.327 3.123-15.613 8.327-20.817 14.572-5.204 6.245-10.408 12.49-12.49 20.817-3.123 8.326-4.163 15.612-4.163 23.939v133.228h-88.472V176.945h-1.041zM989.84 312.254c0-19.776 3.122-39.552 10.41-56.205 7.28-17.695 16.65-32.266 29.14-45.797 12.49-13.531 27.06-22.899 44.76-30.185 17.69-7.285 36.43-11.449 57.24-11.449 20.82 0 39.56 4.164 57.25 11.449 17.69 7.286 32.27 17.695 45.8 30.185 12.49 12.49 22.9 28.102 29.14 45.797 7.29 17.694 10.41 36.429 10.41 56.205 0 20.817-3.12 39.552-10.41 57.246-7.29 17.695-16.65 32.266-29.14 44.756-12.49 12.49-28.11 22.899-45.8 30.185-17.69 7.286-36.43 11.449-57.25 11.449-20.81 0-40.59-4.163-57.24-11.449-17.7-7.286-32.27-17.695-44.76-30.185-12.49-12.49-21.86-28.102-29.14-44.756-7.288-17.694-10.41-36.429-10.41-57.246zm88.47 0c0 8.327 1.04 17.694 3.12 26.021 2.09 9.368 5.21 16.653 9.37 23.939 4.16 7.286 9.37 13.531 16.65 17.695 7.29 4.163 15.62 7.285 26.03 7.285 10.4 0 18.73-2.081 26.02-7.285 7.28-4.164 12.49-10.409 16.65-17.695 4.16-7.286 7.29-15.612 9.37-23.939 2.08-9.368 3.12-17.694 3.12-26.021 0-8.327-1.04-17.694-3.12-26.021-2.08-9.368-5.21-16.653-9.37-23.939-4.16-7.286-9.37-13.531-16.65-17.695-7.29-5.204-15.62-7.285-26.02-7.285-10.41 0-18.74 2.081-26.03 7.285-7.28 5.205-12.49 10.409-16.65 17.695-4.16 7.286-7.28 15.612-9.37 23.939-2.08 9.368-3.12 17.694-3.12 26.021zM1306.25 176.945h86.39v37.47h1.04c4.17-7.286 9.37-13.531 15.62-18.735 6.24-5.204 13.53-10.408 20.81-14.572 7.29-4.163 15.62-7.286 23.94-9.367 8.33-2.082 16.66-3.123 24.98-3.123 22.9 0 40.6 4.164 53.09 11.449 13.53 7.286 22.89 16.654 29.14 27.062 6.24 10.409 10.41 21.858 12.49 34.348 2.08 12.49 2.08 22.898 2.08 33.307v172.779h-88.47V316.417v-27.061c0-9.368-1.04-16.654-4.16-23.94-3.13-7.286-7.29-12.49-13.53-16.653-6.25-4.164-15.62-6.245-27.07-6.245-8.32 0-15.61 2.081-21.85 5.204-6.25 3.122-11.45 7.286-14.58 13.531-4.16 5.204-6.24 11.449-8.32 18.735s-3.12 14.572-3.12 21.858v145.717h-88.48V176.945zM1780.88 234.19h-55.17v122.819c0 10.408 3.12 17.694 8.33 20.817 6.24 3.122 13.53 5.204 22.9 5.204 4.16 0 7.28 0 11.45-1.041h11.45v65.573c-8.33 0-15.62 1.041-23.94 2.082-8.33 1.04-16.66 1.041-23.94 1.041-18.74 0-34.35-2.082-46.84-5.205-12.49-3.122-21.86-8.326-29.14-15.612-7.29-7.286-12.49-16.654-14.58-29.144-3.12-12.49-4.16-27.062-4.16-45.797V234.19h-44.76v-57.246h44.76V94.717h88.47v82.227h55.17v57.246zM1902.66 143.639h-88.48V75.984h88.48v67.655zm-89.52 33.307h88.48v270.618h-88.48V176.946zM2024.43 334.111c1.04 18.735 6.25 33.307 16.66 44.756 10.4 11.449 24.98 16.653 43.71 16.653 10.41 0 20.82-2.081 30.19-7.286 9.36-5.204 16.65-12.49 20.81-22.898h83.27c-4.16 15.613-10.41 29.144-19.78 40.593-9.36 11.449-19.77 20.817-31.22 28.102-12.49 7.286-24.98 12.491-39.55 16.654-14.57 3.122-29.15 5.204-43.72 5.204-21.86 0-41.63-3.122-60.37-9.367-18.73-6.246-34.34-15.613-46.83-28.103-12.49-12.49-22.9-27.062-30.19-45.797-7.28-17.694-10.41-38.511-10.41-60.369 0-20.817 4.17-39.552 11.45-57.246 7.29-17.694 17.7-32.266 31.23-44.756 13.53-12.49 29.14-21.858 46.83-29.144 17.7-7.286 36.43-10.408 56.21-10.408 23.94 0 45.8 4.163 63.49 12.49 17.7 8.327 33.31 19.776 44.76 35.389 11.45 15.612 20.81 32.266 26.02 52.042 5.2 19.776 8.33 41.633 7.28 64.532h-199.84v-1.041zm110.33-49.961c-1.04-15.612-6.24-28.102-15.61-39.551-9.37-10.409-21.86-16.654-37.47-16.654s-28.1 5.204-38.51 
15.613c-10.41 10.408-16.66 23.939-18.74 40.592h110.33zM2254.46 176.945h86.39v47.879h1.04c6.25-17.694 16.65-30.185 31.23-39.552 14.57-9.368 31.22-13.531 49.96-13.531h10.4c3.13 0 7.29 1.041 10.41 2.082v81.185c-6.24-2.082-11.45-3.122-16.65-4.163-5.21-1.041-11.45-1.041-16.65-1.041-11.45 0-20.82 2.082-29.15 5.204-8.32 3.123-15.61 8.327-20.81 14.572-6.25 6.245-10.41 12.49-12.49 20.817-3.13 8.326-4.17 15.612-4.17 23.939v133.228h-88.47V176.945h-1.04zM2534.45 359.091c0 7.286 1.04 12.49 4.16 17.694 3.12 5.204 6.24 9.368 10.41 12.49 4.16 3.123 9.36 5.204 14.57 7.286 6.24 2.082 11.45 2.082 17.69 2.082 4.17 0 8.33 0 13.53-2.082 5.21-1.041 9.37-3.123 13.53-5.204 4.17-2.082 7.29-5.204 10.41-9.368 3.13-4.163 4.17-8.327 4.17-13.531 0-5.204-2.09-9.367-5.21-12.49-3.12-3.122-7.28-6.245-11.45-8.327-4.16-2.081-9.36-4.163-14.57-5.204-5.2-1.041-9.37-2.081-13.53-3.122-13.53-3.123-28.1-6.245-42.67-9.368-14.58-3.122-28.11-7.286-40.6-12.49-12.49-6.245-22.9-13.531-30.18-23.939-8.33-10.409-11.45-23.94-11.45-42.675 0-16.653 4.16-30.184 11.45-40.592 8.33-10.409 17.69-18.736 30.18-24.981 12.49-6.245 26.02-10.408 40.6-13.53 14.57-3.123 28.1-4.164 41.63-4.164 14.57 0 29.14 1.041 43.71 4.164 14.58 2.081 27.07 7.285 39.56 13.53 12.49 6.245 21.85 15.613 29.14 27.062 7.29 11.45 11.45 26.021 12.49 43.716h-82.23c0-10.409-4.16-18.736-11.45-23.94-7.28-4.163-16.65-7.286-28.1-7.286-4.16 0-8.32 0-12.49 1.041-4.16 1.041-8.32 1.041-12.49 2.082-4.16 1.041-7.28 3.122-9.37 6.245-2.08 3.122-4.16 6.245-4.16 11.449 0 6.245 3.12 11.449 10.41 15.613 6.24 4.163 14.57 7.286 24.98 10.408 10.41 2.082 20.82 5.204 32.27 7.286 11.44 2.082 22.89 4.163 33.3 6.245 13.53 3.123 24.98 7.286 33.31 13.531 9.37 6.245 15.61 12.49 20.82 19.776 5.2 7.286 9.36 14.572 11.45 21.858 2.08 7.285 3.12 13.53 3.12 19.776 0 17.694-4.17 33.306-11.45 45.796-8.33 12.491-17.7 21.858-30.19 30.185-12.49 7.286-26.02 12.49-41.63 16.653-15.61 3.123-31.22 5.204-45.8 5.204-15.61 0-32.26-1.04-47.87-4.163-15.62-3.122-29.15-8.327-41.64-15.612a83.855 83.855 0 01-30.18-30.185c-8.33-12.49-12.49-28.102-12.49-46.838h84.31v-2.081z" fill="#FFFFFF" class="Ibar__logo__text"></path> <path d="M0 481.911V281.028l187.351-58.287v200.882L0 481.911z" fill="#8BC53F"></path> <path d="M187.351 423.623V222.741l126.983 87.431v200.882l-126.983-87.431z" fill="#EBD417"></path> <path d="M126.982 569.341L0 481.911l187.351-58.287 126.983 87.43-187.352 58.287z" fill="#034EA1"></path> <path d="M183.188 212.331l51.001-116.574 65.573 155.085-51.001 116.574-65.573-155.085z" fill="#712E74"></path> <path d="M248.761 367.415l51.001-116.574 171.739-28.102-49.96 115.533-172.78 29.143z" fill="#009FD1"></path> <path d="M299.762 250.842L234.189 95.757l171.739-28.103 65.573 155.085-171.739 28.103z" fill="#F6921E"></path> <path d="M187.352 222.741L59.328 198.802 44.757 71.819 172.78 95.76l14.572 126.982z" fill="#DA2128"></path> <path d="M172.78 95.758L44.757 71.818l70.777-70.776 128.023 23.94-70.777 70.776z" fill="#25BCBD"></path> <path d="M258.129 153.005l-70.777 69.736-14.571-126.982 70.777-70.778 14.571 128.024z" fill="#00844A"></path></svg></a></div> <a aria-label="Frontiers in Systems Neuroscience" href="//www.frontiersin.org/journals/systems-neuroscience" data-event="iBar-a-journalHome" class="Ibar__journalName"><div logoClass="Ibar__logo--mixed" class="Ibar__journalName__container"><div class="Ibar__journal__maskLogo" style="display:none;"><img src="" class="Ibar__journal__logo"></div> <div class="Ibar__journalName"><span>Frontiers in</span> <span> Systems Neuroscience</span></div></div></a> <div parent-data-event="iBar" 
class="Ibar__dropdown Ibar__dropdown--aboutUs"><button class="Ibar__dropdown__trigger"><!----> About us </button> <div class="Ibar__dropdown__menu"><div class="Ibar__dropdown__menu__header"><button aria-label="Close Dropdown" class="Ibar__dropdown__menu__header__title"> About us </button> <button aria-label="Close Dropdown" class="Ibar__close"></button></div> <div class="Ibar__dropdown__about"><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Who we are</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/mission" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Mission and values</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/history" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">History</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/leadership" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Leadership</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/awards" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Awards</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Impact and progress</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/impact" target="_self" data-event="iBar-aboutUs_1-a_impactAndProgress">Frontiers' impact</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://progressreport.frontiersin.org/?utm_source=fweb&utm_medium=frep&utm_campaign=pr20" target="_blank" data-event="iBar-aboutUs_1-a_impactAndProgress">Progress Report 2022</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/annual-reports" target="_self" data-event="iBar-aboutUs_1-a_impactAndProgress">All annual reports</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Publishing model</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/how-we-publish" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">How we publish</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/open-access" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Open access</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/peer-review" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Peer review</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/research-integrity" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Research integrity</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/research-topics" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Research Topics</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/fair-data-management" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">FAIR² Data Management</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/fee-policy" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Fee policy</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Services</li> <li class="Ibar__dropdown__about__block__item"><a 
href="https://publishingpartnerships.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_3-a_services">Societies</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/open-access-agreements/consortia" target="_self" data-event="iBar-aboutUs_3-a_services">National consortia</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/open-access-agreements" target="_self" data-event="iBar-aboutUs_3-a_services">Institutional partnerships</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/collaborators" target="_self" data-event="iBar-aboutUs_3-a_services">Collaborators</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">More from Frontiers</li> <li class="Ibar__dropdown__about__block__item"><a href="https://forum.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Frontiers Forum</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersplanetprize.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Frontiers Planet Prize</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://pressoffice.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Press office</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.orgabout/sustainability" target="_self" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Sustainability</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://careers.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Career opportunities</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/contact" target="_self" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Contact us</a></li></ul></div></div></div> <a href="https://www.frontiersin.org/journals" data-event="iBar-a-allJournals" class="Ibar__link">All journals</a><a href="https://www.frontiersin.org/articles" data-event="iBar-a-allArticles" class="Ibar__link">All articles</a> <a href="https://www.frontiersin.org/submission/submit?domainid=1&fieldid=55&specialtyid=1091&entitytype=1&entityid=5" data-event="iBar-a-submit" class="Ibar__button Ibar__submit">Submit your research</a> <div class="Ibar__spacer"></div> <a href="/search" aria-label="Search" data-event="iBar-a-search" class="Ibar__icon Ibar__icon--search"><span>Search</span></a> <!----> <!----> <!----> <div class="Ibar__userArea"></div></div></div> <div class="Ibar__menu Ibar__menu--journal"><div class="Ibar__menu__header"><div class="Ibar__logo"><div class="Ibar__logo"><a href="//www.frontiersin.org/" aria-label="Frontiershome" data-event="iBar-a-home" class="Ibar__logo__link"><svg viewBox="0 0 2811 590" fill="none" xmlns="http://www.w3.org/2000/svg" class="Ibar__logo__svg"><path d="M633.872 234.191h-42.674v-57.246h42.674c0-19.776 2.082-35.389 5.204-48.92 4.164-13.53 9.368-23.939 17.695-31.225 8.326-8.326 18.735-13.53 32.266-16.653 13.531-3.123 29.143-5.204 47.878-5.204h21.858c7.286 0 14.572 1.04 21.857 1.04v62.451c-8.326-1.041-16.653-2.082-23.939-2.082-10.408 0-17.694 1.041-23.939 4.164-6.245 3.122-9.368 10.408-9.368 22.898v13.531h53.083v57.246h-53.083v213.372h-89.512V234.191zM794.161 176.945h86.39v47.879h1.041c6.245-17.694 16.653-30.185 31.225-39.552 14.572-9.368 31.225-13.531 49.96-13.531h10.409c3.122 0 7.286 1.041 10.408 
2.082v81.185c-6.245-2.082-11.449-3.122-16.653-4.163-5.204-1.041-11.449-1.041-16.654-1.041-11.449 0-20.816 2.082-29.143 5.204-8.327 3.123-15.613 8.327-20.817 14.572-5.204 6.245-10.408 12.49-12.49 20.817-3.123 8.326-4.163 15.612-4.163 23.939v133.228h-88.472V176.945h-1.041zM989.84 312.254c0-19.776 3.122-39.552 10.41-56.205 7.28-17.695 16.65-32.266 29.14-45.797 12.49-13.531 27.06-22.899 44.76-30.185 17.69-7.285 36.43-11.449 57.24-11.449 20.82 0 39.56 4.164 57.25 11.449 17.69 7.286 32.27 17.695 45.8 30.185 12.49 12.49 22.9 28.102 29.14 45.797 7.29 17.694 10.41 36.429 10.41 56.205 0 20.817-3.12 39.552-10.41 57.246-7.29 17.695-16.65 32.266-29.14 44.756-12.49 12.49-28.11 22.899-45.8 30.185-17.69 7.286-36.43 11.449-57.25 11.449-20.81 0-40.59-4.163-57.24-11.449-17.7-7.286-32.27-17.695-44.76-30.185-12.49-12.49-21.86-28.102-29.14-44.756-7.288-17.694-10.41-36.429-10.41-57.246zm88.47 0c0 8.327 1.04 17.694 3.12 26.021 2.09 9.368 5.21 16.653 9.37 23.939 4.16 7.286 9.37 13.531 16.65 17.695 7.29 4.163 15.62 7.285 26.03 7.285 10.4 0 18.73-2.081 26.02-7.285 7.28-4.164 12.49-10.409 16.65-17.695 4.16-7.286 7.29-15.612 9.37-23.939 2.08-9.368 3.12-17.694 3.12-26.021 0-8.327-1.04-17.694-3.12-26.021-2.08-9.368-5.21-16.653-9.37-23.939-4.16-7.286-9.37-13.531-16.65-17.695-7.29-5.204-15.62-7.285-26.02-7.285-10.41 0-18.74 2.081-26.03 7.285-7.28 5.205-12.49 10.409-16.65 17.695-4.16 7.286-7.28 15.612-9.37 23.939-2.08 9.368-3.12 17.694-3.12 26.021zM1306.25 176.945h86.39v37.47h1.04c4.17-7.286 9.37-13.531 15.62-18.735 6.24-5.204 13.53-10.408 20.81-14.572 7.29-4.163 15.62-7.286 23.94-9.367 8.33-2.082 16.66-3.123 24.98-3.123 22.9 0 40.6 4.164 53.09 11.449 13.53 7.286 22.89 16.654 29.14 27.062 6.24 10.409 10.41 21.858 12.49 34.348 2.08 12.49 2.08 22.898 2.08 33.307v172.779h-88.47V316.417v-27.061c0-9.368-1.04-16.654-4.16-23.94-3.13-7.286-7.29-12.49-13.53-16.653-6.25-4.164-15.62-6.245-27.07-6.245-8.32 0-15.61 2.081-21.85 5.204-6.25 3.122-11.45 7.286-14.58 13.531-4.16 5.204-6.24 11.449-8.32 18.735s-3.12 14.572-3.12 21.858v145.717h-88.48V176.945zM1780.88 234.19h-55.17v122.819c0 10.408 3.12 17.694 8.33 20.817 6.24 3.122 13.53 5.204 22.9 5.204 4.16 0 7.28 0 11.45-1.041h11.45v65.573c-8.33 0-15.62 1.041-23.94 2.082-8.33 1.04-16.66 1.041-23.94 1.041-18.74 0-34.35-2.082-46.84-5.205-12.49-3.122-21.86-8.326-29.14-15.612-7.29-7.286-12.49-16.654-14.58-29.144-3.12-12.49-4.16-27.062-4.16-45.797V234.19h-44.76v-57.246h44.76V94.717h88.47v82.227h55.17v57.246zM1902.66 143.639h-88.48V75.984h88.48v67.655zm-89.52 33.307h88.48v270.618h-88.48V176.946zM2024.43 334.111c1.04 18.735 6.25 33.307 16.66 44.756 10.4 11.449 24.98 16.653 43.71 16.653 10.41 0 20.82-2.081 30.19-7.286 9.36-5.204 16.65-12.49 20.81-22.898h83.27c-4.16 15.613-10.41 29.144-19.78 40.593-9.36 11.449-19.77 20.817-31.22 28.102-12.49 7.286-24.98 12.491-39.55 16.654-14.57 3.122-29.15 5.204-43.72 5.204-21.86 0-41.63-3.122-60.37-9.367-18.73-6.246-34.34-15.613-46.83-28.103-12.49-12.49-22.9-27.062-30.19-45.797-7.28-17.694-10.41-38.511-10.41-60.369 0-20.817 4.17-39.552 11.45-57.246 7.29-17.694 17.7-32.266 31.23-44.756 13.53-12.49 29.14-21.858 46.83-29.144 17.7-7.286 36.43-10.408 56.21-10.408 23.94 0 45.8 4.163 63.49 12.49 17.7 8.327 33.31 19.776 44.76 35.389 11.45 15.612 20.81 32.266 26.02 52.042 5.2 19.776 8.33 41.633 7.28 64.532h-199.84v-1.041zm110.33-49.961c-1.04-15.612-6.24-28.102-15.61-39.551-9.37-10.409-21.86-16.654-37.47-16.654s-28.1 5.204-38.51 15.613c-10.41 10.408-16.66 23.939-18.74 40.592h110.33zM2254.46 176.945h86.39v47.879h1.04c6.25-17.694 16.65-30.185 31.23-39.552 14.57-9.368 
31.22-13.531 49.96-13.531h10.4c3.13 0 7.29 1.041 10.41 2.082v81.185c-6.24-2.082-11.45-3.122-16.65-4.163-5.21-1.041-11.45-1.041-16.65-1.041-11.45 0-20.82 2.082-29.15 5.204-8.32 3.123-15.61 8.327-20.81 14.572-6.25 6.245-10.41 12.49-12.49 20.817-3.13 8.326-4.17 15.612-4.17 23.939v133.228h-88.47V176.945h-1.04zM2534.45 359.091c0 7.286 1.04 12.49 4.16 17.694 3.12 5.204 6.24 9.368 10.41 12.49 4.16 3.123 9.36 5.204 14.57 7.286 6.24 2.082 11.45 2.082 17.69 2.082 4.17 0 8.33 0 13.53-2.082 5.21-1.041 9.37-3.123 13.53-5.204 4.17-2.082 7.29-5.204 10.41-9.368 3.13-4.163 4.17-8.327 4.17-13.531 0-5.204-2.09-9.367-5.21-12.49-3.12-3.122-7.28-6.245-11.45-8.327-4.16-2.081-9.36-4.163-14.57-5.204-5.2-1.041-9.37-2.081-13.53-3.122-13.53-3.123-28.1-6.245-42.67-9.368-14.58-3.122-28.11-7.286-40.6-12.49-12.49-6.245-22.9-13.531-30.18-23.939-8.33-10.409-11.45-23.94-11.45-42.675 0-16.653 4.16-30.184 11.45-40.592 8.33-10.409 17.69-18.736 30.18-24.981 12.49-6.245 26.02-10.408 40.6-13.53 14.57-3.123 28.1-4.164 41.63-4.164 14.57 0 29.14 1.041 43.71 4.164 14.58 2.081 27.07 7.285 39.56 13.53 12.49 6.245 21.85 15.613 29.14 27.062 7.29 11.45 11.45 26.021 12.49 43.716h-82.23c0-10.409-4.16-18.736-11.45-23.94-7.28-4.163-16.65-7.286-28.1-7.286-4.16 0-8.32 0-12.49 1.041-4.16 1.041-8.32 1.041-12.49 2.082-4.16 1.041-7.28 3.122-9.37 6.245-2.08 3.122-4.16 6.245-4.16 11.449 0 6.245 3.12 11.449 10.41 15.613 6.24 4.163 14.57 7.286 24.98 10.408 10.41 2.082 20.82 5.204 32.27 7.286 11.44 2.082 22.89 4.163 33.3 6.245 13.53 3.123 24.98 7.286 33.31 13.531 9.37 6.245 15.61 12.49 20.82 19.776 5.2 7.286 9.36 14.572 11.45 21.858 2.08 7.285 3.12 13.53 3.12 19.776 0 17.694-4.17 33.306-11.45 45.796-8.33 12.491-17.7 21.858-30.19 30.185-12.49 7.286-26.02 12.49-41.63 16.653-15.61 3.123-31.22 5.204-45.8 5.204-15.61 0-32.26-1.04-47.87-4.163-15.62-3.122-29.15-8.327-41.64-15.612a83.855 83.855 0 01-30.18-30.185c-8.33-12.49-12.49-28.102-12.49-46.838h84.31v-2.081z" fill="#FFFFFF" class="Ibar__logo__text"></path> <path d="M0 481.911V281.028l187.351-58.287v200.882L0 481.911z" fill="#8BC53F"></path> <path d="M187.351 423.623V222.741l126.983 87.431v200.882l-126.983-87.431z" fill="#EBD417"></path> <path d="M126.982 569.341L0 481.911l187.351-58.287 126.983 87.43-187.352 58.287z" fill="#034EA1"></path> <path d="M183.188 212.331l51.001-116.574 65.573 155.085-51.001 116.574-65.573-155.085z" fill="#712E74"></path> <path d="M248.761 367.415l51.001-116.574 171.739-28.102-49.96 115.533-172.78 29.143z" fill="#009FD1"></path> <path d="M299.762 250.842L234.189 95.757l171.739-28.103 65.573 155.085-171.739 28.103z" fill="#F6921E"></path> <path d="M187.352 222.741L59.328 198.802 44.757 71.819 172.78 95.76l14.572 126.982z" fill="#DA2128"></path> <path d="M172.78 95.758L44.757 71.818l70.777-70.776 128.023 23.94-70.777 70.776z" fill="#25BCBD"></path> <path d="M258.129 153.005l-70.777 69.736-14.571-126.982 70.777-70.778 14.571 128.024z" fill="#00844A"></path></svg></a></div></div> <button aria-label="Close Menu" data-event="iBarMenu-btn-closeMenu" class="Ibar__close"></button></div> <div class="Ibar__menu__wrapper"><div class="Ibar__menu__journal"><a href="//www.frontiersin.org/journals/systems-neuroscience" data-event="iBarMenu-a-journalHome"><div class="Ibar__journalName__container"><div class="Ibar__journal__maskLogo" style="display:none;"><img src="" class="Ibar__journal__logo"></div> <div class="Ibar__journalName"><span>Frontiers in</span> <span> Systems Neuroscience</span></div></div></a> <!----> <a href="//www.frontiersin.org/journals/systems-neuroscience/articles" 
data-event="iBar-a-articles" class="Ibar__link">Articles</a><a href="//www.frontiersin.org/journals/systems-neuroscience/research-topics" data-event="iBar-a-researchTopics" class="Ibar__link">Research Topics</a><a href="//www.frontiersin.org/journals/systems-neuroscience/editors" data-event="iBar-a-editorialBoard" class="Ibar__link">Editorial board</a> <div parent-data-event="iBarMenu" class="Ibar__dropdown"><button class="Ibar__dropdown__trigger"><!----> About journal </button> <div class="Ibar__dropdown__menu"><div class="Ibar__dropdown__menu__header"><button aria-label="Close Dropdown" class="Ibar__dropdown__menu__header__title"> About journal </button> <button aria-label="Close Dropdown" class="Ibar__close"></button></div> <div class="Ibar__dropdown__about"><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Scope</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-editors" target="_self" data-event="iBar-aboutJournal_0-a_scope">Specialty chief editors</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-scope" target="_self" data-event="iBar-aboutJournal_0-a_scope">Mission & scope</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-facts" target="_self" data-event="iBar-aboutJournal_0-a_scope">Facts</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-submission" target="_self" data-event="iBar-aboutJournal_0-a_scope">Submission</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-open" target="_self" data-event="iBar-aboutJournal_0-a_scope">Open access statement</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#copyright-statement" target="_self" data-event="iBar-aboutJournal_0-a_scope">Copyright statement</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-quality" target="_self" data-event="iBar-aboutJournal_0-a_scope">Quality</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">For authors</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/why-submit" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Why submit?</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/article-types" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Article types</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/author-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Author guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/editor-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Editor guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/publishing-fees" target="_self" 
data-event="iBar-aboutJournal_1-a_forAuthors">Publishing fees</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/submission-checklist" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Submission checklist</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/contact-editorial-office" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Contact editorial office</a></li></ul></div></div></div></div> <div parent-data-event="iBarMenu" class="Ibar__dropdown Ibar__dropdown--aboutUs"><button class="Ibar__dropdown__trigger"><!----> About us </button> <div class="Ibar__dropdown__menu"><div class="Ibar__dropdown__menu__header"><button aria-label="Close Dropdown" class="Ibar__dropdown__menu__header__title"> About us </button> <button aria-label="Close Dropdown" class="Ibar__close"></button></div> <div class="Ibar__dropdown__about"><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Who we are</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/mission" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Mission and values</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/history" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">History</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/leadership" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Leadership</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/awards" target="_self" data-event="iBar-aboutUs_0-a_whoWeAre">Awards</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Impact and progress</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/impact" target="_self" data-event="iBar-aboutUs_1-a_impactAndProgress">Frontiers' impact</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://progressreport.frontiersin.org/?utm_source=fweb&utm_medium=frep&utm_campaign=pr20" target="_blank" data-event="iBar-aboutUs_1-a_impactAndProgress">Progress Report 2022</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/annual-reports" target="_self" data-event="iBar-aboutUs_1-a_impactAndProgress">All annual reports</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Publishing model</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/how-we-publish" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">How we publish</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/open-access" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Open access</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/peer-review" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Peer review</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/research-integrity" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Research integrity</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/research-topics" target="_self" 
data-event="iBar-aboutUs_2-a_publishingModel">Research Topics</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/fair-data-management" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">FAIR² Data Management</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/fee-policy" target="_self" data-event="iBar-aboutUs_2-a_publishingModel">Fee policy</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Services</li> <li class="Ibar__dropdown__about__block__item"><a href="https://publishingpartnerships.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_3-a_services">Societies</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/open-access-agreements/consortia" target="_self" data-event="iBar-aboutUs_3-a_services">National consortia</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/open-access-agreements" target="_self" data-event="iBar-aboutUs_3-a_services">Institutional partnerships</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/collaborators" target="_self" data-event="iBar-aboutUs_3-a_services">Collaborators</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">More from Frontiers</li> <li class="Ibar__dropdown__about__block__item"><a href="https://forum.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Frontiers Forum</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersplanetprize.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Frontiers Planet Prize</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://pressoffice.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Press office</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.orgabout/sustainability" target="_self" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Sustainability</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://careers.frontiersin.org/" target="_blank" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Career opportunities</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/about/contact" target="_self" data-event="iBar-aboutUs_4-a_moreFromFrontiers">Contact us</a></li></ul></div></div></div> <a href="https://www.frontiersin.org/journals" data-event="iBar-a-allJournals" class="Ibar__link">All journals</a><a href="https://www.frontiersin.org/articles" data-event="iBar-a-allArticles" class="Ibar__link">All articles</a> <!----> <!----> <!----> <a href="https://www.frontiersin.org/submission/submit?domainid=1&fieldid=55&specialtyid=1091&entitytype=1&entityid=5" data-event="iBarMenu-a-submit" class="Ibar__button Ibar__submit">Submit your research</a></div></div> <div class="Ibar__journal"><div class="Ibar__wrapper Ibar__wrapper--journal"><a aria-label="Frontiers in Systems Neuroscience" href="//www.frontiersin.org/journals/systems-neuroscience" data-event="iBarJournal-a-journalHome" class="Ibar__journalName"><div class="Ibar__journalName__container"><div class="Ibar__journal__maskLogo" style="display:none;"><img src="" class="Ibar__journal__logo"></div> <div class="Ibar__journalName"><span>Frontiers in</span> <span> Systems Neuroscience</span></div></div></a> 
<!----> <a href="//www.frontiersin.org/journals/systems-neuroscience/articles" data-event="iBar-a-articles" class="Ibar__link">Articles</a><a href="//www.frontiersin.org/journals/systems-neuroscience/research-topics" data-event="iBar-a-researchTopics" class="Ibar__link">Research Topics</a><a href="//www.frontiersin.org/journals/systems-neuroscience/editors" data-event="iBar-a-editorialBoard" class="Ibar__link">Editorial board</a> <div parent-data-event="iBarJournal" class="Ibar__dropdown"><button class="Ibar__dropdown__trigger"><!----> About journal </button> <div class="Ibar__dropdown__menu"><div class="Ibar__dropdown__menu__header"><button aria-label="Close Dropdown" class="Ibar__dropdown__menu__header__title"> About journal </button> <button aria-label="Close Dropdown" class="Ibar__close"></button></div> <div class="Ibar__dropdown__about"><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Scope</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-editors" target="_self" data-event="iBar-aboutJournal_0-a_scope">Specialty chief editors</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-scope" target="_self" data-event="iBar-aboutJournal_0-a_scope">Mission & scope</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-facts" target="_self" data-event="iBar-aboutJournal_0-a_scope">Facts</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-submission" target="_self" data-event="iBar-aboutJournal_0-a_scope">Submission</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-open" target="_self" data-event="iBar-aboutJournal_0-a_scope">Open access statement</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#copyright-statement" target="_self" data-event="iBar-aboutJournal_0-a_scope">Copyright statement</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-quality" target="_self" data-event="iBar-aboutJournal_0-a_scope">Quality</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">For authors</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/why-submit" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Why submit?</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/article-types" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Article types</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/author-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Author guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/editor-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Editor guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a 
href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/publishing-fees" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Publishing fees</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/submission-checklist" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Submission checklist</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/contact-editorial-office" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Contact editorial office</a></li></ul></div></div></div> <div class="Ibar__spacer"></div></div></div> <div class="Ibar__journal Ibar__journal--mix"><div class="Ibar__wrapper Ibar__wrapper--journal"><div class="Ibar__logo"><a href="//www.frontiersin.org/" aria-label="Frontiershome" data-event="iBar-a-home" class="Ibar__logo__link"><svg viewBox="0 0 2811 590" fill="none" xmlns="http://www.w3.org/2000/svg" class="Ibar__logo__svg"><path d="M633.872 234.191h-42.674v-57.246h42.674c0-19.776 2.082-35.389 5.204-48.92 4.164-13.53 9.368-23.939 17.695-31.225 8.326-8.326 18.735-13.53 32.266-16.653 13.531-3.123 29.143-5.204 47.878-5.204h21.858c7.286 0 14.572 1.04 21.857 1.04v62.451c-8.326-1.041-16.653-2.082-23.939-2.082-10.408 0-17.694 1.041-23.939 4.164-6.245 3.122-9.368 10.408-9.368 22.898v13.531h53.083v57.246h-53.083v213.372h-89.512V234.191zM794.161 176.945h86.39v47.879h1.041c6.245-17.694 16.653-30.185 31.225-39.552 14.572-9.368 31.225-13.531 49.96-13.531h10.409c3.122 0 7.286 1.041 10.408 2.082v81.185c-6.245-2.082-11.449-3.122-16.653-4.163-5.204-1.041-11.449-1.041-16.654-1.041-11.449 0-20.816 2.082-29.143 5.204-8.327 3.123-15.613 8.327-20.817 14.572-5.204 6.245-10.408 12.49-12.49 20.817-3.123 8.326-4.163 15.612-4.163 23.939v133.228h-88.472V176.945h-1.041zM989.84 312.254c0-19.776 3.122-39.552 10.41-56.205 7.28-17.695 16.65-32.266 29.14-45.797 12.49-13.531 27.06-22.899 44.76-30.185 17.69-7.285 36.43-11.449 57.24-11.449 20.82 0 39.56 4.164 57.25 11.449 17.69 7.286 32.27 17.695 45.8 30.185 12.49 12.49 22.9 28.102 29.14 45.797 7.29 17.694 10.41 36.429 10.41 56.205 0 20.817-3.12 39.552-10.41 57.246-7.29 17.695-16.65 32.266-29.14 44.756-12.49 12.49-28.11 22.899-45.8 30.185-17.69 7.286-36.43 11.449-57.25 11.449-20.81 0-40.59-4.163-57.24-11.449-17.7-7.286-32.27-17.695-44.76-30.185-12.49-12.49-21.86-28.102-29.14-44.756-7.288-17.694-10.41-36.429-10.41-57.246zm88.47 0c0 8.327 1.04 17.694 3.12 26.021 2.09 9.368 5.21 16.653 9.37 23.939 4.16 7.286 9.37 13.531 16.65 17.695 7.29 4.163 15.62 7.285 26.03 7.285 10.4 0 18.73-2.081 26.02-7.285 7.28-4.164 12.49-10.409 16.65-17.695 4.16-7.286 7.29-15.612 9.37-23.939 2.08-9.368 3.12-17.694 3.12-26.021 0-8.327-1.04-17.694-3.12-26.021-2.08-9.368-5.21-16.653-9.37-23.939-4.16-7.286-9.37-13.531-16.65-17.695-7.29-5.204-15.62-7.285-26.02-7.285-10.41 0-18.74 2.081-26.03 7.285-7.28 5.205-12.49 10.409-16.65 17.695-4.16 7.286-7.28 15.612-9.37 23.939-2.08 9.368-3.12 17.694-3.12 26.021zM1306.25 176.945h86.39v37.47h1.04c4.17-7.286 9.37-13.531 15.62-18.735 6.24-5.204 13.53-10.408 20.81-14.572 7.29-4.163 15.62-7.286 23.94-9.367 8.33-2.082 16.66-3.123 24.98-3.123 22.9 0 40.6 4.164 53.09 11.449 13.53 7.286 22.89 16.654 29.14 27.062 6.24 10.409 10.41 21.858 12.49 34.348 2.08 12.49 2.08 22.898 2.08 33.307v172.779h-88.47V316.417v-27.061c0-9.368-1.04-16.654-4.16-23.94-3.13-7.286-7.29-12.49-13.53-16.653-6.25-4.164-15.62-6.245-27.07-6.245-8.32 0-15.61 2.081-21.85 5.204-6.25 
3.122-11.45 7.286-14.58 13.531-4.16 5.204-6.24 11.449-8.32 18.735s-3.12 14.572-3.12 21.858v145.717h-88.48V176.945zM1780.88 234.19h-55.17v122.819c0 10.408 3.12 17.694 8.33 20.817 6.24 3.122 13.53 5.204 22.9 5.204 4.16 0 7.28 0 11.45-1.041h11.45v65.573c-8.33 0-15.62 1.041-23.94 2.082-8.33 1.04-16.66 1.041-23.94 1.041-18.74 0-34.35-2.082-46.84-5.205-12.49-3.122-21.86-8.326-29.14-15.612-7.29-7.286-12.49-16.654-14.58-29.144-3.12-12.49-4.16-27.062-4.16-45.797V234.19h-44.76v-57.246h44.76V94.717h88.47v82.227h55.17v57.246zM1902.66 143.639h-88.48V75.984h88.48v67.655zm-89.52 33.307h88.48v270.618h-88.48V176.946zM2024.43 334.111c1.04 18.735 6.25 33.307 16.66 44.756 10.4 11.449 24.98 16.653 43.71 16.653 10.41 0 20.82-2.081 30.19-7.286 9.36-5.204 16.65-12.49 20.81-22.898h83.27c-4.16 15.613-10.41 29.144-19.78 40.593-9.36 11.449-19.77 20.817-31.22 28.102-12.49 7.286-24.98 12.491-39.55 16.654-14.57 3.122-29.15 5.204-43.72 5.204-21.86 0-41.63-3.122-60.37-9.367-18.73-6.246-34.34-15.613-46.83-28.103-12.49-12.49-22.9-27.062-30.19-45.797-7.28-17.694-10.41-38.511-10.41-60.369 0-20.817 4.17-39.552 11.45-57.246 7.29-17.694 17.7-32.266 31.23-44.756 13.53-12.49 29.14-21.858 46.83-29.144 17.7-7.286 36.43-10.408 56.21-10.408 23.94 0 45.8 4.163 63.49 12.49 17.7 8.327 33.31 19.776 44.76 35.389 11.45 15.612 20.81 32.266 26.02 52.042 5.2 19.776 8.33 41.633 7.28 64.532h-199.84v-1.041zm110.33-49.961c-1.04-15.612-6.24-28.102-15.61-39.551-9.37-10.409-21.86-16.654-37.47-16.654s-28.1 5.204-38.51 15.613c-10.41 10.408-16.66 23.939-18.74 40.592h110.33zM2254.46 176.945h86.39v47.879h1.04c6.25-17.694 16.65-30.185 31.23-39.552 14.57-9.368 31.22-13.531 49.96-13.531h10.4c3.13 0 7.29 1.041 10.41 2.082v81.185c-6.24-2.082-11.45-3.122-16.65-4.163-5.21-1.041-11.45-1.041-16.65-1.041-11.45 0-20.82 2.082-29.15 5.204-8.32 3.123-15.61 8.327-20.81 14.572-6.25 6.245-10.41 12.49-12.49 20.817-3.13 8.326-4.17 15.612-4.17 23.939v133.228h-88.47V176.945h-1.04zM2534.45 359.091c0 7.286 1.04 12.49 4.16 17.694 3.12 5.204 6.24 9.368 10.41 12.49 4.16 3.123 9.36 5.204 14.57 7.286 6.24 2.082 11.45 2.082 17.69 2.082 4.17 0 8.33 0 13.53-2.082 5.21-1.041 9.37-3.123 13.53-5.204 4.17-2.082 7.29-5.204 10.41-9.368 3.13-4.163 4.17-8.327 4.17-13.531 0-5.204-2.09-9.367-5.21-12.49-3.12-3.122-7.28-6.245-11.45-8.327-4.16-2.081-9.36-4.163-14.57-5.204-5.2-1.041-9.37-2.081-13.53-3.122-13.53-3.123-28.1-6.245-42.67-9.368-14.58-3.122-28.11-7.286-40.6-12.49-12.49-6.245-22.9-13.531-30.18-23.939-8.33-10.409-11.45-23.94-11.45-42.675 0-16.653 4.16-30.184 11.45-40.592 8.33-10.409 17.69-18.736 30.18-24.981 12.49-6.245 26.02-10.408 40.6-13.53 14.57-3.123 28.1-4.164 41.63-4.164 14.57 0 29.14 1.041 43.71 4.164 14.58 2.081 27.07 7.285 39.56 13.53 12.49 6.245 21.85 15.613 29.14 27.062 7.29 11.45 11.45 26.021 12.49 43.716h-82.23c0-10.409-4.16-18.736-11.45-23.94-7.28-4.163-16.65-7.286-28.1-7.286-4.16 0-8.32 0-12.49 1.041-4.16 1.041-8.32 1.041-12.49 2.082-4.16 1.041-7.28 3.122-9.37 6.245-2.08 3.122-4.16 6.245-4.16 11.449 0 6.245 3.12 11.449 10.41 15.613 6.24 4.163 14.57 7.286 24.98 10.408 10.41 2.082 20.82 5.204 32.27 7.286 11.44 2.082 22.89 4.163 33.3 6.245 13.53 3.123 24.98 7.286 33.31 13.531 9.37 6.245 15.61 12.49 20.82 19.776 5.2 7.286 9.36 14.572 11.45 21.858 2.08 7.285 3.12 13.53 3.12 19.776 0 17.694-4.17 33.306-11.45 45.796-8.33 12.491-17.7 21.858-30.19 30.185-12.49 7.286-26.02 12.49-41.63 16.653-15.61 3.123-31.22 5.204-45.8 5.204-15.61 0-32.26-1.04-47.87-4.163-15.62-3.122-29.15-8.327-41.64-15.612a83.855 83.855 0 01-30.18-30.185c-8.33-12.49-12.49-28.102-12.49-46.838h84.31v-2.081z" 
fill="#FFFFFF" class="Ibar__logo__text"></path> <path d="M0 481.911V281.028l187.351-58.287v200.882L0 481.911z" fill="#8BC53F"></path> <path d="M187.351 423.623V222.741l126.983 87.431v200.882l-126.983-87.431z" fill="#EBD417"></path> <path d="M126.982 569.341L0 481.911l187.351-58.287 126.983 87.43-187.352 58.287z" fill="#034EA1"></path> <path d="M183.188 212.331l51.001-116.574 65.573 155.085-51.001 116.574-65.573-155.085z" fill="#712E74"></path> <path d="M248.761 367.415l51.001-116.574 171.739-28.102-49.96 115.533-172.78 29.143z" fill="#009FD1"></path> <path d="M299.762 250.842L234.189 95.757l171.739-28.103 65.573 155.085-171.739 28.103z" fill="#F6921E"></path> <path d="M187.352 222.741L59.328 198.802 44.757 71.819 172.78 95.76l14.572 126.982z" fill="#DA2128"></path> <path d="M172.78 95.758L44.757 71.818l70.777-70.776 128.023 23.94-70.777 70.776z" fill="#25BCBD"></path> <path d="M258.129 153.005l-70.777 69.736-14.571-126.982 70.777-70.778 14.571 128.024z" fill="#00844A"></path></svg></a></div> <a aria-label="Frontiers in Systems Neuroscience" href="//www.frontiersin.org/journals/systems-neuroscience" data-event="iBarJournal-a-journalHome" class="Ibar__journalName"><div logoClass="Ibar__logo--mixed" class="Ibar__journalName__container"><div class="Ibar__journal__maskLogo" style="display:none;"><img src="" class="Ibar__journal__logo"></div> <div class="Ibar__journalName"><span>Frontiers in</span> <span> Systems Neuroscience</span></div></div></a> <div class="Ibar__spacer"></div> <!----> <a href="//www.frontiersin.org/journals/systems-neuroscience/articles" data-event="iBar-a-articles" class="Ibar__link">Articles</a><a href="//www.frontiersin.org/journals/systems-neuroscience/research-topics" data-event="iBar-a-researchTopics" class="Ibar__link">Research Topics</a><a href="//www.frontiersin.org/journals/systems-neuroscience/editors" data-event="iBar-a-editorialBoard" class="Ibar__link">Editorial board</a> <div parent-data-event="iBarJournal" class="Ibar__dropdown"><button class="Ibar__dropdown__trigger"><!----> About journal </button> <div class="Ibar__dropdown__menu"><div class="Ibar__dropdown__menu__header"><button aria-label="Close Dropdown" class="Ibar__dropdown__menu__header__title"> About journal </button> <button aria-label="Close Dropdown" class="Ibar__close"></button></div> <div class="Ibar__dropdown__about"><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">Scope</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-editors" target="_self" data-event="iBar-aboutJournal_0-a_scope">Specialty chief editors</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-scope" target="_self" data-event="iBar-aboutJournal_0-a_scope">Mission & scope</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-facts" target="_self" data-event="iBar-aboutJournal_0-a_scope">Facts</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-submission" target="_self" data-event="iBar-aboutJournal_0-a_scope">Submission</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-open" target="_self" data-event="iBar-aboutJournal_0-a_scope">Open access statement</a></li><li 
class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#copyright-statement" target="_self" data-event="iBar-aboutJournal_0-a_scope">Copyright statement</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/about#about-quality" target="_self" data-event="iBar-aboutJournal_0-a_scope">Quality</a></li></ul><ul class="Ibar__dropdown__about__block"><li class="Ibar__dropdown__about__block__title">For authors</li> <li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/why-submit" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Why submit?</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/article-types" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Article types</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/author-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Author guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/editor-guidelines" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Editor guidelines</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/publishing-fees" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Publishing fees</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/submission-checklist" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Submission checklist</a></li><li class="Ibar__dropdown__about__block__item"><a href="https://www.frontiersin.org/journals/systems-neuroscience/for-authors/contact-editorial-office" target="_self" data-event="iBar-aboutJournal_1-a_forAuthors">Contact editorial office</a></li></ul></div></div></div> <div class="Ibar__spacer"></div> <a href="https://www.frontiersin.org/submission/submit?domainid=1&fieldid=55&specialtyid=1091&entitytype=1&entityid=5" data-event="iBarJournal-a-submit" class="Ibar__button Ibar__submit"><span>Submit</span> <span> your research</span></a> <a href="/search" aria-label="Search" data-event="iBar-a-search" class="Ibar__icon Ibar__icon--search"><span>Search</span></a> <!----> <!----> <!----> <div class="Ibar__userArea"></div></div></div></nav> <div class="ArticlePage"><div><div class="Layout Layout--withAside Layout--withIbarMix ArticleDetails"><!----> <main class="Layout__main"><!----> <div class="ArticleDetails__main"><div class="ArticleLayoutHeader"><div class="ArticleLayoutHeader__info"><p class="ArticleLayoutHeader__info__title"> HYPOTHESIS AND THEORY article </p> <p class="ArticleLayoutHeader__info__journalDate"><span>Front. Syst. 
Neurosci.</span> <span>, 18 November 2015</span></p> <!----> <p class="ArticleLayoutHeader__info__doiVolume"><span> Volume 9 - 2015 | </span> <a href="https://doi.org/10.3389/fnsys.2015.00156" class="ArticleLayoutHeader__info__doi"> https://doi.org/10.3389/fnsys.2015.00156 </a></p> <!----></div> <!----> <p class="ArticleLayoutHeader__isPartOfRT"><span class="ArticleLayoutHeader__isPartOfRT__label">This article is part of the Research Topic</span> <span class="ArticleLayoutHeader__isPartOfRT__title">Paradigm shifts and innovations in Neuroscience</span> <span class="Link__wrapper"><a aria-label="View all 89 articles" href="https://www.frontiersin.org/research-topics/3099/paradigm-shifts-and-innovations-in-neuroscience/articles" target="_self" data-event="customLink-link-a_viewAll89Articles" class="Link Link--linkType Link--maincolor Link--medium Link--icon Link--chevronRight Link--right"><span>View all 89 articles</span></a></span></p></div> <div class="ArticleDetails__main__content"><div class="ArticleDetails__main__content__main ArticleDetails__main__content__main--fullArticle"><div class="JournalAbstract"><div class="JournalAbstract__titleWrapper"><h1>Perception and Reality: Why a Wholly Empirical Paradigm is Needed to Understand Vision</h1> <!----></div> <div class="RelatedArticles"><div class="Alert Alert--warning"><div class="Alert__icon"><svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 48 48" class="FeedbackIcon" style="width:24px;"><path fill="#fff" d="M12 10h24v28H12z"></path> <path fill="var(--orange50)" d="M24 2.88a21.12 21.12 0 1 0 0 42.24 21.12 21.12 0 0 0 0-42.24Zm1.92 31.73a.96.96 0 0 1-.96.96h-1.92a.96.96 0 0 1-.96-.96v-1.92a.96.96 0 0 1 .96-.96h1.92a.96.96 0 0 1 .96.96v1.92Zm0-7.68a.96.96 0 0 1-.96.96h-1.92a.96.96 0 0 1-.96-.96V13.49a.96.96 0 0 1 .96-.96h1.92a.96.96 0 0 1 .96.96v13.44Z"></path></svg></div> <div class="Alert__main"><p class="Alert__message">A commentary has been posted on this article:</p> <!----></div> <ol class="Alert__info"><li class="Alert__infoItem"><p class="Alert__infoItem__text">Commentary: Perception and Reality: Why a Wholly Empirical Paradigm is Needed to Understand Vision</p> <ol class="Alert__infoItem__links"><li><span class="Link__wrapper"><a aria-label="Read general commentary" href="/articles/10.3389/fnsys.2016.00077" data-event="customLink-link-a_readGeneralCommentary" class="Link Link--linkType Link--grey Link--small Link--icon Link--chevronRight Link--right"><span>Read general commentary</span></a></span></li></ol></li></ol></div></div></div> <div class="JournalFullText"><div class="JournalAbstract"> <a id="h1" name="h1"></a> <div class="authors"><span class="author-wrapper notranslate"> <a href="https://loop.frontiersin.org/people/37265" class="user-id-37265"><img class="pr5" src="https://loop.frontiersin.org/images/profile/37265/74" onerror="this.onerror=null;this.src='https://loop.frontiersin.org/cdn/images/profile/default_32.jpg';" alt="\r\nDale Purves*">Dale Purves</a><sup>1*</sup></span><span class="author-wrapper notranslate"><a href="https://loop.frontiersin.org/people/101928" class="user-id-101928"><img class="pr5" src="https://loop.frontiersin.org/images/profile/101928/74" onerror="this.onerror=null;this.src='https://loop.frontiersin.org/cdn/images/profile/default_32.jpg';" alt="Yaniv Morgenstern">Yaniv Morgenstern</a><sup>2</sup></span><span class="author-wrapper notranslate"><a href="https://loop.frontiersin.org/people/149132" class="user-id-149132"><img class="pr5" 
src="https://loop.frontiersin.org/images/profile/149132/74" onerror="this.onerror=null;this.src='https://loop.frontiersin.org/cdn/images/profile/default_32.jpg';" alt="William T. Wojtach,">William T. Wojtach</a><sup>1,2</sup></span></div> <ul class="notes"> <li><span><sup>1</sup></span>Duke Institute for Brain Sciences, Duke University, Durham, NC, USA</li> <li><span><sup>2</sup></span>Duke-NUS Graduate Medical School, Singapore, Singapore</li> </ul> <p>A central puzzle in vision science is how perceptions that are routinely at odds with physical measurements of real world properties can arise from neural responses that nonetheless lead to effective behaviors. Here we argue that the solution depends on: (1) rejecting the assumption that the goal of vision is to recover, however imperfectly, properties of the world; and (2) replacing it with a paradigm in which perceptions reflect biological utility based on past experience rather than objective features of the environment. Present evidence is consistent with the conclusion that conceiving vision in wholly empirical terms provides a plausible way to understand what we see and why.</p> <div class="clear"></div> </div> <div class="JournalFullText"> <a id="h2" name="h2"></a><h2>Introduction</h2> <p class="mb15">A widely accepted concept of vision in recent decades stems from studies carried out by Stephen Kuffler, David Hubel and Torsten Wiesel beginning in the 1950s (<a href="#B45">Kuffler, 1953</a>; <a href="#B33">Hubel and Wiesel, 2005</a>). This seminal work showed that neurons in the primary visual pathway of cats and monkeys respond to light stimuli in specific ways, implying that the detection of retinal image features plays a central role in visual perception. Based on the properties of simpler input-level cells, Hubel and Wiesel discovered that neurons in V1 respond selectively to retinal activation elicited by oriented bars of light, bars of a certain length, bars moving in different directions, and stimuli with different spectral properties. These and other findings earned Hubel and Wiesel a Nobel Prize in 1981 (Kuffler had died in 1980), and inspired a generation of scientists to pursue similar electrophysiological and neuroanatomical research in a variety of species in the ongoing effort to reveal how vision works.</p> <p class="mb15">A seemingly straightforward interpretation of these observations is that the visual system operates analytically, extracting features from retinal images, efficiently filtering and processing image features in a series of computational steps, and ultimately combining them to provide a close approximation of physical reality that is then used to guide behavior. This concept of visual perception is logical, accords with electrophysiological and anatomical evidence, and has the further merit of being similar to the operation of computers, providing an analogy that connects biological vision with machine vision and artificial intelligence (<a href="#B50">Marr, 1982</a>). Finally, this interpretation concurs with the impression that we see the world more or less as it really is and behave accordingly. Indeed, to do otherwise would seem to defy common sense and insure failure.</p> <p class="mb0">Attractive though it is, this interpretation fails to consider an axiomatic fact about biological vision: retinal images conflate the physical properties of objects, and therefore cannot be used to recover the objective properties of the world (Figure <a href="#F1">1</a>). 
Consequently, the basic visual qualities we perceive—e.g., lightness, color, form, distance, depth and motion—cannot specify reality. A further fact that adds to the challenge of understanding how vision works is the discrepancy between these perceived qualities and the physical parameters of objects and conditions in the world (Figure 2). As numerous psychophysical studies have shown, lightness and darkness percepts are at odds with luminance, color is at odds with distributions of spectral power, size, distance and depth are at odds with geometrical measurements, and speeds and directions of motion are at odds with measured vectors (Gelb, 1929; Stevens, 1975; Rock, 1984; Robinson, 1998; Purves and Lotto, 2003; Wojtach et al., 2008, 2009; Sung et al., 2009; Purves et al., 2014). These differences between perception and reality cannot be dismissed as minor errors or approximations that are "close enough" to succeed, since the discrepancies are ubiquitous and often profound (see Figure 2A, for example).

Figure 1. The major obstacle to the concept of vision as feature representation. (A) Luminance values in retinal stimuli are determined by illumination and reflectance, as well as a host of other factors (e.g., atmospheric transmittance, spectral content, and many more). These physical parameters are conflated in light stimuli, however, precluding biological measurements of the objective world in which perceptions and behaviors must play out. (B) The analogous conflation of geometrical information in retinal stimuli.

Figure 2. The perception of basic visual qualities is at odds with the world assessed by physical instruments. (A) One of many examples generated over the last century or more illustrating the discrepancy between luminance and lightness. Although each of the patches indicated in the inset returns the same amount of light to the eye (i.e., they have the same luminance), their apparent lightness values in the scene are very different. (B) An example of the discrepancy between perceived and measured geometry that has again been repeatedly documented since the mid-19th century. The lines on the left are all of equal length, but, as shown on the right, are perceived differently depending on their orientation (apparent length is expressed in relation to the horizontal line, which is seen as shortest in psychophysical testing).

The result has been diminished confidence in concepts of vision based on retinal feature detection, opening the door to other ways of understanding visual perception, the purposes of visual circuitry, and the genesis of visually guided behavior. A common denominator of these alternative views is the use of past experience—i.e., empirical evidence—to explain vision.

Early Ideas About Vision on an Empirical Basis

The loss of information due to the transformation of three-dimensional (3-D) Euclidean space into two-dimensional (2-D) images and the introduction of noise inherent in biological processes led some early schools of psychology to advocate theories of vision that included the influence of lifetime experience.
This line of thinking began in the mid-19th century when Hermann von Helmholtz proposed that perceptions arising from impoverished images are supplemented by "unconscious inferences" about reality made on the basis of individual experience (Helmholtz, 1866/1924). He added the qualifier "unconscious" because observers are rarely aware of how their past experience could affect perception.

For much of the first half of the 20th century the role of empirical information in determining perception was conceived in terms of gestalt laws or other heuristics. The gestalt school was founded shortly after the turn of the century by Max Wertheimer (1912/1950), and advanced under the aegis of his students Kurt Koffka (1935) and Wolfgang Köhler (1947). At the core of gestalt theory is the idea that the "units of experience go with the functional units in the underlying physiological processes" (Köhler, 1947, p. 63). In gestalt terms, this influence was codified as the "law of Prägnanz" (meaning "conciseness"), expressing the idea that, based on experience, any perception would be determined by the simplest possible source of the image in question. Building on some of these ideas, Egon Brünswik argued further that, in order to fully understand perception, the connection between the organism and the environment must be clarified. Given that information acquired by sense organs is uncertain, he supposed that visual animals must rely on the statistical nature of environments to achieve their goals. As described in his theory of "probabilistic functionalism" (Brünswik, 1956/1997), Brünswik anticipated some current empirical approaches to vision based on probable world states (see Vision as Bayesian Inference).

Brünswik's emphasis on the environment influenced the subsequent work of James Gibson (1966, 1979), who carried empirical thinking in yet another direction by arguing that perception is determined by the objects and circumstances observers are exposed to when moving through the world. Gibson proposed that observers could directly perceive their environment by relying on "invariances" in the structure of retinal images, a position similar to the statistical regularity of objects and conditions (e.g., commonly encountered ratios, proportions, and the like) identified by Brünswik. The invariances used by agents exploring the world led Gibson to posit vision as a "perceptual system" that included both the body and its environment—a position fundamentally different from Helmholtz's idea of empirically modifying retinal image information acquired by a "sensing system". In the case of size and distance, for example, Gibson maintained that the ratio of object projections to background textures provided the kind of invariant information that would allow an observer to directly apprehend otherwise ambiguous size-distance relationships. He took the mechanism to be one of "resonance" between the activity of a perceptual system and the properties of the environment that gave rise to light stimuli.

Although these early empirical strategies were imaginative and in some ways prescient, they suffered from an absence of ties to the structure and function of animal visual systems. Thus Helmholtz, Wertheimer, Koffka, Köhler, Brünswik, and Gibson were necessarily vague, speculative or simply mute about how empirical information might be usefully implemented in visual system physiology and anatomy. In consequence, empirical approaches to vision began to languish at mid-century, while visual neurobiology, with its increasingly concrete evidence about how visual systems operate at the neuronal level, came to dominate vision science in the 1960s and for the next several decades (Hubel and Wiesel, 2005).

By the 1990s, however, it was becoming increasingly apparent that, despite key insights into the feature-selective properties of visual neurons, neurophysiological and neuroanatomical approaches to perception were unable to explain how processing retinal image features could, in principle, contend with the inability of visual stimuli to convey information about the objective properties of the world (see Figure 1). At the same time, advances in computer hardware and software were rapidly making the evaluation of large datasets relatively easy. Accordingly, investigators began to re-examine the merits of vision determined by past experience. The basis of much of this thinking has been that visual perception could be understood as probabilistic inferences about the most likely physical states of the world.

Vision as Bayesian Inference

The most popular approach to vision as statistical inference is based on Bayesian decision theory (Knill and Richards, 1996; Mamassian et al., 2002; Kersten and Yuille, 2003; Lee and Mumford, 2003; Kersten et al., 2004; Knill and Pouget, 2004; Körding, 2014). In effect, investigators built on Helmholtz's idea of unconscious inference, formally recasting it in terms of Bayes' theorem (Bayes, 1763), a widely used procedure for assessing the probability of an inference being correct given a body of inconclusive evidence. The theorem states that the probability of inference A being true given evidence B (the posterior probability) depends on the probability of obtaining B given inference A (the likelihood), multiplied by the probability of inference A being true (the prior), these factors typically being normalized by dividing by the probability of evidence B.
Thus the theorem can be written as:

$$p(A \mid B) = \frac{p(B \mid A)\,p(A)}{p(B)} \qquad (1)$$

To illustrate a Bayesian approach to vision, consider a simple example in which an image is generated by a single light source and a given surface reflectance (e.g., Brainard, 2009; see also Allred and Brainard, 2013). Although many physical factors are involved in generating natural images (see Figure 1A), the luminance values (L) in an image are primarily the product of the intensity of illumination (I) and reflectance properties of surfaces (R). Thus the first step in validating the idea that vision follows Bayes' theorem would be to determine the probability distributions of surface reflectance and illumination values—the priors p(R) and p(I), respectively—which can be approximated by measurements in the environment. The next step would be to derive the likelihood function p(L|R, I), i.e., the probability of a specific luminance being generated by various surface reflectance and illumination intensities. The posterior distribution, p(R, I|L), is then obtained by multiplying the prior distribution by the likelihood function:

$$p(R, I \mid L) = \frac{p(L \mid R, I)\,p(R)\,p(I)}{p(L)} \qquad (2)$$

Because the posterior distribution indicates only the relative probabilities of a set of possible sources, a final step is to select particular reflectance and illumination values from the set according to an assumed gain-loss function. The perceptual outcome—the lightness seen—would presumably accord with the surface reflectance at the most likely combination of surface reflectance and illuminant intensity values. Thus, perceived lightness is taken to be an estimate of surface reflectance.
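To make the sequence of steps concrete, the following is a minimal sketch in Python of the computation just described. It is an illustration only, not a model from the literature: the grids, the priors, and the Gaussian observation noise are all assumptions chosen for the example.

```python
import numpy as np

# Hypothetical discretizations of the physical variables (arbitrary units).
R = np.linspace(0.05, 0.95, 91)   # candidate surface reflectances
I = np.linspace(0.10, 2.00, 96)   # candidate illumination intensities

# Priors p(R) and p(I); in practice these would be approximated by
# measurements in the environment (see text). Here simply assumed.
pR = np.exp(-0.5 * ((R - 0.4) / 0.2) ** 2); pR /= pR.sum()
pI = np.exp(-0.5 * ((I - 1.0) / 0.5) ** 2); pI /= pI.sum()

# Likelihood p(L | R, I): luminance is modeled as R * I plus Gaussian
# sensory noise -- an assumption made purely for illustration.
def likelihood(L, sigma=0.05):
    Rg, Ig = np.meshgrid(R, I, indexing="ij")
    return np.exp(-0.5 * ((L - Rg * Ig) / sigma) ** 2)

# Posterior p(R, I | L) per equation (2), normalized so it sums to 1
# (the division by p(L)).
L_obs = 0.30
posterior = likelihood(L_obs) * pR[:, None] * pI[None, :]
posterior /= posterior.sum()

# Select the most probable combination; perceived lightness is then
# taken to estimate the reflectance at that maximum.
r_idx, i_idx = np.unravel_index(posterior.argmax(), posterior.shape)
print(f"most probable source: R = {R[r_idx]:.2f}, I = {I[i_idx]:.2f}")
```

The point of the sketch is only the structure of the computation; as argued below, the priors and likelihood it presupposes are exactly the quantities that biological visual systems cannot measure.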
Experimental assessments of the responses to visual stimuli are made in terms of a Bayesian "ideal observer", defined as an observer who always responds to the most probable state of the world (Geisler, 2011)—e.g., the most probable surface reflectance value that could have given rise to a retinal luminance value. As indicated in equation (2), an experimenter can assess how well humans approach this ideal by measuring perceptual estimates—in this case, by gauging perceived lightness, which is assumed to be an estimate of surface reflectance—and comparing these to predictions that combine stimulus information in a statistically optimal fashion (e.g., Ernst and Banks, 2002; Weiss et al., 2002). Studies of this sort have supported the conclusion that vision can indeed be modeled as a system based on Bayesian inferences. Whether estimating surface slant (Knill and Saunders, 2003), responding to apparent motion stimuli (Weiss et al., 2002; Stocker and Simoncelli, 2006), planning movements (Körding and Wolpert, 2004; Tassinari et al., 2006), integrating somatosensory haptics and visual cues (Ernst and Banks, 2002), combining prior real world assumptions with those in the scene at hand (Morgenstern et al., 2011), or reporting lightness (Allred and Brainard, 2013), subjects perform at close to Bayesian optimality.

The compelling logic of Bayesian decision theory and its useful formalization of Helmholtz's concept of empirical inference notwithstanding, Bayesian approaches that rely on estimating properties of the world are at a loss when seeking to understand visual neurobiology and/or the neural mechanisms underlying psychophysical functions. The reason is simply that biological visual systems cannot acquire the information that Bayesian decision theory demands: when a Bayesian ideal observer predicts perception, it is because the perceived quality is assumed to estimate the actual properties and conditions in the world. Given the inherent ambiguity of retinal images (see Figure 1), however, Bayesian priors and likelihoods of reflectance, illumination or other physical variables are not available to biological visual systems.

Although it is possible to model how neural activity in different sensory systems could be combined using Bayesian decision theory (Fetsch et al., 2013), such models cannot indicate how information about the physical world could be obtained in a way that avoids the quandary illustrated in Figure 1. Indeed, any model based on recovering or estimating real-world parameters, statistically or otherwise, will fail as a canonical explanation of visual perception (see also Jones and Love, 2011; Bowers and Davis, 2012). Biological vision must therefore depend on some other strategy that does not require accessing the real-world parameters of image sources.

Information Theoretic Approaches

A different empirical approach to vision is based on information theory. Within a few years of Claude Shannon's idea of using Boolean algebra to design switching circuits that could make messages transmitted over noisy communication channels more efficient (Shannon, 1948; Shannon and Weaver, 1949), this framework was applied to vision (Attneave, 1954; Barlow, 1961). The premise of these studies was that the properties of visual and other sensory systems would encode, transmit, and decode the empirical characteristics of naturally occurring stimuli with maximum efficiency. Subsequent approaches in these terms have variously interpreted vision to operate on the basis of predictive coding (Srinivasan et al., 1982; Rao and Ballard, 1999; Hosoya et al., 2005); coding that de-correlates the information of noisy inputs (Barlow, 1961; Laughlin, 1981); a filtering scheme for ensuring sparse coding (Olshausen and Field, 1996); and/or greater efficiency achieved by divisive normalization (Schwartz and Simoncelli, 2001; Carandini and Heeger, 2012).

The overarching theme of this approach is that optimizing information transfer by minimizing the metabolic and other costs of wiring, action potential generation and synaptic transfer—while at the same time maximizing the entropy of neural communication—could rationalize the characteristics of receptive fields in visual animals (Graham and Field, 2007). As it has turned out, the idea that some features of visual systems arise from efficiently encoding the statistical structure of natural environments is consistent with a number of computational (Srinivasan et al., 1982; Atick and Redlich, 1993; Olshausen and Field, 1996; Bell and Sejnowski, 1997; van Hateren and van der Schaaf, 1998; Brady and Field, 2000; Schwartz and Simoncelli, 2001; Simoncelli and Olshausen, 2001) and physiological studies (Dan et al., 1996; Baddeley et al., 1997; Vinje and Gallant, 2000).
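One of these ideas, decorrelation of redundant inputs, can be illustrated with a short sketch. This is a generic demonstration of whitening, not an implementation of any of the models cited above; the correlated "patches" are synthetic stand-ins for natural image structure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an ensemble of image patches: cumulative sums
# make neighboring samples correlated, as nearby points in natural
# images tend to be (see text).
n, dim = 5000, 16
patches = np.cumsum(rng.normal(size=(n, dim)), axis=1)
patches -= patches.mean(axis=0)

# Whitening (decorrelating) transform derived from the eigendecomposition
# of the input covariance matrix.
cov = patches.T @ patches / n
vals, vecs = np.linalg.eigh(cov)
W = vecs @ np.diag(1.0 / np.sqrt(vals + 1e-8)) @ vecs.T

# After the transform, output channels carry uncorrelated signals:
# their covariance is approximately the identity matrix.
white = patches @ W.T
print(np.round((white.T @ white / n)[:4, :4], 2))
```

Applied to two-dimensional patches of natural images, filters of this general kind take on a center-surround organization, which is one way the efficient-coding literature has rationalized early receptive field properties (e.g., Atick and Redlich, 1993).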
Although the success of models based on information theory leaves no doubt about the advantages of efficient visual processing, the models do not explain how the inevitable conflation of information in images is dealt with by the visual system (see earlier and Figure 1), or why perceived visual qualities do not correspond with measured physical parameters in the visual environment (see Figure 2). Nor do they indicate how biological visual systems successfully guide behavior.

While these deficiencies do not diminish the importance of efficient neural processing conceived in terms of Shannon entropy, efficiency is not directly germane to perception and behavior, just as efficiency in telecommunication is not germane to the content of the messages that are transmitted. Generating perceptions that succeed in a world whose physical parameters cannot be recovered is a different goal, in much the same way that the functional aim of any organ system differs from the concurrent need to achieve its purposes as efficiently as possible.

A Wholly Empirical Approach

The aim of the visual system in the approaches considered so far is assumed to be the recovery of real world properties, however imperfectly, from information in retinal stimuli. A different supposition is that since retinal images cannot specify the measurable properties of objects (see Figure 1), achieving this goal is impossible. It follows that visual perceptions must arise from a strategy that does not rely on real world properties as such.
In a wholly empirical conception of vision, the perceptual values we experience are determined by ordering visual qualities according to the frequency of occurrence of image patterns and how this impacts survival (Purves and Lotto, 2003, 2011; Purves et al., 2011, 2014).

In general terms, understanding this strategy is straightforward. Imagine a population of primitive organisms whose behavior is dictated by rudimentary collections of photoreceptors and associated neural connections. As stipulated by neo-Darwinian theory, the organization of both the receptors and their connections in the population is subject to small random variations in structure and function that are acted on by natural selection. Based on interactions with the environment, variations of pre-neural and neural configurations that promote survival tend to be passed down to future generations. As a result, the ranks of visual qualities an agent perceives over some evolved range (darkest-lightest, largest-smallest, fastest-slowest, etc.) reflect biological utility rather than the physically measurable properties of objects and conditions in the world. In short, the role of perceptual states is not to reveal the physical world, but to promote useful behaviors. In this scheme, the world is simply the arena in which the utility of perceptions and other behavioral responses pertinent to survival and reproduction is tested, with feedback from the environment acting as the driving force that gradually instantiates the needed circuitry (Figure 3).

Figure 3. Visual perception based on the frequency of occurrence of patterns and subsequent behavior. By depending on the frequency of scale-invariant patterns in images, useful perceptions can arise without information about physically measurable properties of the world. The driving force in this understanding of vision is a biological feedback loop that, over time, orders the basic visual qualities we perceive by associating the frequency of recurring image patterns with perceptual qualities according to survival and reproductive success.
In implementing this strategy, however, vision cannot rely on entire images, as efficient coding theory has long recognized (see Information Theoretic Approaches). The reason is that the extraordinary detail in most retinal images will rarely, if ever, activate the full array of photoreceptors in exactly the same way again. Processes like evolution and lifetime learning, however, depend on repeated trial and error. Thus rather than relying on images per se, biological vision is better served by relying on the recurring scale-invariant patterns within images to rank perceptual qualities (scale invariance refers to a relationship that does not change when variables such as length and width are multiplied by a common factor). In this way the biological feedback loop diagrammed in Figure 3 can progressively organize both ordinal (e.g., lighter-darker, larger-smaller) and non-ordinal (e.g., color, direction) visual qualities over useful ranges according to the relative frequency of pattern occurrences and feedback from behavior. This concept is consistent with classical physiological studies demonstrating the transformation of images by the evolved receptive fields of early level visual neurons (Hubel, 1988; Hubel and Wiesel, 2005), with the goal of reducing the redundancy of image information by efficient coding (Graham and Field, 2007), and with psychophysical studies showing that the frequency of occurrence of image patterns extracted from natural scenes predicts human visual perceptions (Yang and Purves, 2004).

An Example

To appreciate how vision can operate in this way, consider the perceptions of lightness-darkness elicited by natural luminance patterns. Figure 4 shows two simple patterns in which the luminance of the central squares is the same, but the luminance of the surrounding areas differs.
As has been noted since Michel Chevreul's studies in the 19th century, the central squares appear differently light, thus failing to agree with physical measurements.

Figure 4. Lightness percepts elicited by luminance patterns. The two patterns comprise central squares with identical luminance values surrounded by regions that have a lower (left panel) or higher (right panel) luminance. The central squares appear differently light in these contexts, despite the fact that they are physically the same. The inset shows that when placed on the same background the central squares elicit the same lightness, although this percept differs from the lightness of the squares in either of the two patterns above.

In wholly empirical terms, the reason for this effect is outlined in Figure 5. In the course of maximizing survival and reproductive success in response to scale-invariant patterns of luminance, evolution and lifetime learning will have ranked perceptions of relative lightness-darkness according to the frequency of occurrence of the luminance of any element in a pattern, given the luminance values of the rest of the elements. Absent this ordering according to the frequency of recurring image patterns, the generation of useful perceptions and behaviors would be stymied by the fact that these or any other patterns cannot specify the measured properties of the objects and conditions that gave rise to them (see Figure 1).

Figure 5. Lightness predicted by the frequency of recurrent luminance patterns. The contexts of luminance patterns in column 1 are the same as in Figure 4, with an unspecified central value indicated by the question marks. The frequency of occurrence of central luminance values in these patterns can be determined by repeatedly sampling natural images using the patterns as templates (see column 2). To maximize behavioral success, the lightness elicited by the central luminance value in Figure 4 (indicated by the red 'Ts' in column 2) should evolve to accord with its accumulated frequency of occurrence in the two patterns (dashed red lines in the graphs in column 3) rather than with its actual luminance, thus explaining why the same central luminance in Figure 4 is perceived differently. Organisms therefore evolve to match their perceptions to the accumulated frequencies of occurrence of targets given a context through their enhanced survival over evolutionary time (as shown in Figure 3). (Note that using templates to determine the frequency of occurrence of patterns is simply a convenient way of collecting the pertinent data, and does not imply that the visual system uses templates to sample retinal images.) (Original data in Yang and Purves, 2004.)
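The template-based sampling that Figure 5 describes can also be sketched in code. The following is a simplified illustration, not the procedure of Yang and Purves (2004): the "images" are synthetic smoothed noise standing in for a calibrated natural-image database, and the template is reduced to a center pixel plus its immediate surround.

```python
import numpy as np

rng = np.random.default_rng(1)

def fake_image(size=128, passes=8):
    """Synthetic stand-in for a natural image: repeated local averaging
    of noise, so that nearby points have similar luminance values."""
    img = rng.random((size, size))
    for _ in range(passes):
        img = (img + np.roll(img, 1, 0) + np.roll(img, 1, 1)) / 3.0
    return (img - img.min()) / (img.max() - img.min())

def center_samples(images, surround_level, tol=0.05, r=2):
    """Collect center luminances wherever the template's surround
    matches a given average luminance, imitating column 2 of Figure 5."""
    out = []
    for img in images:
        h, w = img.shape
        for y in range(r, h - r, 2 * r + 1):
            for x in range(r, w - r, 2 * r + 1):
                patch = img[y - r:y + r + 1, x - r:x + r + 1]
                surround = (patch.sum() - patch[r, r]) / (patch.size - 1)
                if abs(surround - surround_level) < tol:
                    out.append(patch[r, r])
    return np.asarray(out)

images = [fake_image() for _ in range(20)]
dark_ctx = center_samples(images, surround_level=0.3)   # darker surround
light_ctx = center_samples(images, surround_level=0.7)  # lighter surround

# Rank the same target luminance by its cumulative frequency of
# occurrence in each context (column 3 of Figure 5).
t = 0.5
print(f"rank of {t} given dark surround:  {(dark_ctx <= t).mean():.2f}")
print(f"rank of {t} given light surround: {(light_ctx <= t).mean():.2f}")
```

Because nearby points are correlated, the same target value sits high in the conditional distribution for the darker surround and low in the distribution for the lighter one, reproducing the ordering of lightness percepts in Figure 4.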
As shown in Figure 5, the empirical incidence of the two patterns arising in retinal images generated by a database of natural images shows that the same central luminance value occurs less often in the context of a lower-luminance surround than in the context of a higher-luminance surround (column 2; Yang and Purves, 2004). The reason is that in any non-random pattern, nearby points will tend to have similar luminance values (see Figure 4; Olshausen and Field, 1996, 2000). Consequently, if the lightness-darkness values of the central squares are ordered according to their relative frequency of occurrence in these patterns (column 3), the same luminance value should elicit a lighter appearance in the context of a less luminant surround when compared to a more luminant surround, as it does (see Figure 4).

In summary, the frequencies of occurrence of luminance values in image patterns responded to over time predict the qualities we see in this example because the range of this basic perceptual quality (lightness-darkness) has been ordered over a useful range (lightest to darkest) according to the relative success of stimulus-response associations. Similar ordering of data arising from the frequency of pattern occurrence in both natural and simulated environments has been used to rationalize more complex stimuli that elicit perceptions of lightness (Yang and Purves, 2004), color (Long et al., 2006), interval and angle magnitude (Howe and Purves, 2005), the speed of motion (Wojtach et al., 2008, 2009), and the direction of motion (Sung et al., 2009).

Consequences of Input-Output Associations on a Wholly Empirical Basis

Behaviorally successful associations generated in this way automatically tie the frequency of occurrence of stimulus patterns to the frequency of occurrence of responses, explaining why relying on the frequency of occurrence of stimulus patterns predicts perception: every time a given image pattern occurs as input, the associated output arises from trial and error feedback, which in biology tracks reproductive success. The result is perceptions that become more and more useful over time. Although in any trial and error process input-output equivalence is never reached, after sufficient evolution the cumulative distribution function of the stimulus input will come to align with the cumulative distribution function of the perceptual output closely enough to predict many of the results of human psychophysics (Purves et al., 2014).
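The alignment of the two cumulative distributions can be shown in a small numeric sketch. The input distribution below is hypothetical (a lognormal chosen only for illustration); the point is the form of the mapping, in which the percept assigned to a stimulus value is its cumulative frequency of occurrence in past input.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical accumulated history of a stimulus variable (e.g., the
# luminance of a recurring pattern) over evolutionary/individual time.
history = np.sort(rng.lognormal(mean=0.0, sigma=0.5, size=100_000))

def percept_rank(s):
    """Empirical ordering: the percept elicited by stimulus value s is
    its cumulative frequency of occurrence in the history."""
    return np.searchsorted(history, s) / history.size

# The percept tracks accumulated frequency, not physical magnitude...
for s in (0.5, 1.0, 2.0, 4.0):
    print(f"stimulus {s:>3}: perceptual rank {percept_rank(s):.2f}")

# ...so output ranks for inputs drawn from the same history are spread
# uniformly over the perceptual range: the two CDFs align.
ranks = np.searchsorted(history, history) / history.size
print(f"mean output rank: {ranks.mean():.2f} (uniform gives 0.50)")
```

The `history` array is of course a stand-in; in the theory it is the accumulated experience of the species and the individual.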
When conceived in this way it makes sense that visual perceptions are not correlated with light intensity or any other physical property, as psychophysics amply demonstrates. Although relying on the frequency of occurrence of patterns uncouples perceived values from their measured physical parameters (e.g., surface reflectance), it endows visual agents with the ability to perceive and act in their environments in ways that led to biological success in the past, and are therefore likely to succeed in the present. While this strategy makes it seem that we see the world as it really is, vision on a wholly empirical basis is not veridical and has a different goal: to generate useful perceptions without measuring or recovering real-world properties.

Exploring Neuronal Connectivity in Wholly Empirical Terms

Bayesian approaches to perception use inferences about real-world properties as a tool for understanding whatever processing is accomplished by the visual brain. But as has already been emphasized, biological sensing systems cannot recover these properties.

The wholly empirical alternative we describe is generally consistent with other studies that do not assume the recovery of real-world properties (e.g., Janke et al., 1999; Onat et al., 2011). Ultimately, any approach to vision based on empirically successful input-output associations must explain how this strategy is related to the documented physiology and anatomy of the primate and other visual systems. In principle, the most direct way to unravel the circuit mechanics underlying a wholly empirical (or any other) strategy would be to mimic the trial and error process of association on which evolution relies. Until relatively recently, this approach would have been fanciful. But the advent of genetic and other computer algorithms has made simulating the evolution of artificial neural networks in model environments relatively easy. This technology offers a way of linking any empirical understanding of vision to the wealth of information already in hand from physiological and anatomical studies.

A number of studies have shown the feasibility of evolving neural networks on the basis of experience (Geisler and Diehl, 2002; Boots et al., 2007; Corney and Lotto, 2007; Geisler et al., 2009; Burge and Geisler, 2011). More recent work has asked whether the connectivity and operating principles of networks evolved on a wholly empirical basis are similar to those found in biological circuitry. For example, simple networks have been evolved to rank responses according to the frequency of occurrence of patterns extracted from natural and simulated images (Ng et al., 2013; Morgenstern et al., 2014). The most obvious feature that emerges is the center-surround receptive field. In addition to efficiency, this organization enables the interaction of targets and contexts, heightens sensitivity to frequently occurring stimuli, and automatically adapts to overall luminance and local contrast. These features are all characteristic of neurons in the early stages of visual systems like ours (Sakmann and Creutzfeldt, 1969; Geisler and Albrecht, 1992; Bonin et al., 2005; Hubel and Wiesel, 2005).
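In the spirit of these studies, the evolutionary loop can be caricatured in a few lines of code. This is a toy sketch, not a reimplementation of the cited work: a population of linear "networks" is scored by how closely the rank order of their responses matches a target empirical ranking of input patterns (here a stand-in ranking defined purely for illustration), and the best performers are copied with mutation into the next generation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Recurring input patterns (flattened 3x3 patches) and a stand-in
# "empirical rank" for each -- in the cited studies this would be the
# accumulated frequency of occurrence of the pattern.
patterns = rng.random((200, 9))
target_rank = np.argsort(np.argsort(patterns.mean(axis=1)))

def fitness(w):
    """Agreement between a network's response ranking of the patterns
    and the target empirical ranking (negative squared rank error)."""
    response_rank = np.argsort(np.argsort(patterns @ w))
    return -np.mean((response_rank - target_rank) ** 2)

# Selection plus random mutation over many generations.
population = rng.normal(size=(50, 9))
for generation in range(200):
    scores = np.array([fitness(w) for w in population])
    parents = population[np.argsort(scores)[-10:]]       # best 10 survive
    children = parents[rng.integers(0, 10, size=40)]
    children = children + rng.normal(scale=0.1, size=children.shape)
    population = np.vstack([parents, children])

best = population[np.argmax([fitness(w) for w in population])]
print(np.round(best.reshape(3, 3), 2))   # the evolved patch weighting
```

Richer versions of this loop, driven by patterns drawn from natural and simulated images and by feedback standing in for reproductive success, are what yield the center-surround organization described above.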
Vision as Reflexive

Any fully empirical account of vision implies that perceptions and their neural underpinnings are reflexive. The term "reflex" alludes to behaviors such as the "knee-jerk" (myotatic) response that depend on the transfer of information from sensory input to motor output via circuitry established by behavioral success over evolutionary time. The advantages of reflex responses are clear: circuitry that links input to output as directly as possible allows the nervous system to respond with maximum speed and accuracy. It does not follow, however, that reflex responses must be "simple", that they are limited to motor acts, or that they entail only "lower order" neural circuitry. Sherrington (1947), who pioneered the study of reflex circuits, was well aware that the concept of a "simple" reflex is, in his words, a "convenient…fiction", since "all parts of the nervous system are connected together and no part of it is ever capable of reaction without affecting and being affected by other parts …". There is no evidence that any response to sensory input differs from a spinal reflex, other than by the number of synaptic connections in the input-output circuitry. Understanding vision as reflexive (i.e., hard-wired at any given moment but subject to modification by subsequent experience) also accounts for visual perceptions generated within a few tens of milliseconds in response to complex stimuli such as wind-blown leaves, running water, animal movements and numerous other circumstances. Computer vision models that depend on reverse-engineering scenes from images by inferring the large number of real world sources that could have generated these complex image streams would likely require more computational power than is necessary for the tasks that visual and other biological sensing systems routinely carry out. Although it is difficult to imagine how visual systems could generate perceptions of complex scenes almost immediately by a series of hierarchical computations, this problem is resolved if visual "processing" is re-imagined as the result of "computations" that have, in effect, already been accomplished by laying down connectivity instantiated by feedback from empirical success over evolutionary and individual time (see Figure 3). This strategic difference is presumably the main reason why machine vision based on logical algorithms cannot match the performance of biological vision on many tasks.

Limitations of a Wholly Empirical Approach

As with any theory, there are limitations to the strategy of visual perception advocated here, both methodological and conceptual. With respect to methodology, when investigating the perception of lightness (see Figures 4, 5), the luminance values comprising the database were collected from a limited range of environments assumed to be representative of the types of scenes in which the human visual system evolved. In addition, the fact that humans and other animals attend to specific aspects of the environment, thus biasing the frequency distribution of sensory input, was not taken into account. While these and other deficiencies are important, given that this strategy successfully predicts the standard simultaneous lightness contrast effect shown in Figure 4 and a variety of more complex lightness effects (Yang and Purves, 2004)—in addition to other puzzling perceptions of color, form and motion (see above)—the empirical framework seems well supported by evidence that has not been supplied by other approaches.
This last point stands as a challenge to any theory of perception, including broader unifying concepts such as the idea that the common goal of brain function is to satisfy a "free-energy principle" (Friston, 2010).

The Wholly Empirical Theory and Cognition

It is worth noting that higher order phenomena such as visual attention and visual memory could also arise by associating the relative frequency of recurring scale-invariant image patterns with useful responses. As in the case of the basic visual qualities considered here, the relevant circuitry would also be reflexive, without the need to invoke additional "cognitive" mechanisms: every time a given image pattern occurred, the response dictated by association would be further enhanced according to its utility. As a result, the foci of visual attention and the visual memories elicited would, like perceptions, gradually become more and more useful over time.

Conclusion

The idea that vision operates empirically has taken several forms and enjoyed different degrees of enthusiasm since Helmholtz introduced the concept of unconscious inference in the 19th century. Vision on a wholly empirical basis is now seen by some investigators as the most plausible way to understand how stimuli that cannot specify their physical sources can nonetheless give rise to useful perceptions and routinely successful visually guided behaviors. Understanding perception in these terms implies a strategy of nervous system operation that differs fundamentally from the concept of detecting stimulus features and recovering real-world properties by algorithmic computations that in one way or another depend on accessing physical parameters to guide actions. By relying on evolved reflex associations that have ordered visual qualities according to the impact of the relative frequency of occurrence of stimulus patterns on reproductive success, vision can circumvent the inherent uncertainty of retinal images and explain the qualities we actually see.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Allred, S. R., and Brainard, D. H. (2013). A Bayesian model of lightness perception that incorporates spatial variation in the illumination. J. Vis. 13:18. doi: 10.1167/13.7.18

Atick, J., and Redlich, A. (1993). Convergent algorithm for sensory receptive field development. Neural Comput. 5, 45–60. doi: 10.1162/neco.1993.5.1.45
Conclusion

The idea that vision operates empirically has taken several forms and enjoyed varying degrees of enthusiasm since Helmholtz introduced the concept of unconscious inference in the 19th century. Vision on a wholly empirical basis is now seen by some investigators as the most plausible way to understand how stimuli that cannot specify their physical sources can nonetheless give rise to useful perceptions and routinely successful visually guided behaviors. Understanding perception in these terms implies a strategy of nervous system operation that differs fundamentally from detecting stimulus features and recovering real-world properties by algorithmic computations that depend, in one way or another, on access to physical parameters to guide actions. By relying on evolved reflex associations that order visual qualities according to the impact that the relative frequency of stimulus patterns has had on reproductive success, vision can circumvent the inherent uncertainty of retinal images and explain the qualities we actually see.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Allred, S. R., and Brainard, D. H. (2013). A Bayesian model of lightness perception that incorporates spatial variation in the illumination. J. Vis. 13:18. doi: 10.1167/13.7.18
Atick, J., and Redlich, A. (1993). Convergent algorithm for sensory receptive field development. Neural Comput. 5, 45–60. doi: 10.1162/neco.1993.5.1.45
Attneave, F. (1954). Informational aspects of visual perception. Psychol. Rev. 61, 183–193. doi: 10.1037/h0054663
Baddeley, R., Abbott, L. F., Booth, M. C., Sengpiel, F., Freeman, T., Wakeman, E. A., et al. (1997). Responses of neurons in primary and inferior temporal visual cortices to natural scenes. Proc. Biol. Sci. 264, 1775–1783. doi: 10.1098/rspb.1997.0246
Barlow, H. B. (1961). “Possible principles underlying the transformation of sensory messages,” in Sensory Communication, ed. W. A. Rosenblith (Cambridge, MA: MIT Press), 217–236.
Bayes, T. R. (1763). An essay towards solving a problem in the doctrine of chances. Phil. Trans. R. Soc. London 53, 370–418. doi: 10.1098/rstl.1763.0053
Bell, A. J., and Sejnowski, T. J. (1997). The “independent components” of natural scenes are edge filters. Vision Res. 37, 3327–3338. doi: 10.1016/s0042-6989(97)00121-1
Boots, B., Nundy, S., and Purves, D. (2007). Evolution of visually guided behavior in artificial agents. Network 18, 11–34. doi: 10.1080/09548980601113254
Bonin, V., Mante, V., and Carandini, M. (2005). The suppressive field of neurons in lateral geniculate nucleus. J. Neurosci. 25, 10844–10856. doi: 10.1523/jneurosci.3562-05.2005
Bowers, J. S., and Davis, C. J. (2012). Bayesian just-so stories in psychology and neuroscience. Psychol. Bull. 138, 389–414. doi: 10.1037/a0026450
Brady, N., and Field, D. J. (2000). Local contrast in natural images: normalization and coding efficiency. Perception 29, 1041–1056. doi: 10.1068/p2996
Brainard, D. H. (2009). “Bayesian approaches to color vision,” in The Cognitive Neurosciences, 4th Edn., ed. M. S. Gazzaniga (Cambridge, MA: MIT Press), 395–408.
Brunswik, E. (1956/1997). Perception and the Representative Design of Psychological Experiments. 2nd Edn. Berkeley: University of California Press.
Burge, J., and Geisler, W. S. (2011). Optimal defocus estimation in individual natural images. Proc. Natl. Acad. Sci. U S A 108, 16849–16854. doi: 10.1073/pnas.1108491108
Carandini, M., and Heeger, D. J. (2012). Normalization as a canonical neural computation. Nat. Rev. Neurosci. 13, 51–62. doi: 10.1038/nrn3136
Corney, D., and Lotto, R. B. (2007). What are lightness illusions and why do we see them? PLoS Comput. Biol. 3:e180. doi: 10.1371/journal.pcbi.0030180
Dan, Y., Atick, J. J., and Reid, R. C. (1996). Efficient coding of natural scenes in the lateral geniculate nucleus: experimental test of a computational theory. J. Neurosci. 16, 3351–3362.
Ernst, M. O., and Banks, M. S. (2002). Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415, 429–433. doi: 10.1038/415429a
Fetsch, C. R., DeAngelis, G. C., and Angelaki, D. E. (2013). Bridging the gap between theories of sensory cue integration and the physiology of multisensory neurons. Nat. Rev. Neurosci. 14, 429–442. doi: 10.1038/nrn3503
Friston, K. (2010). The free-energy principle: a unified brain theory? Nat. Rev. Neurosci. 11, 127–138. doi: 10.1038/nrn2787
Geisler, W. S. (2011). Contributions of ideal observer theory to vision research. Vision Res. 51, 771–781. doi: 10.1016/j.visres.2010.09.027
Geisler, W. S., and Albrecht, D. G. (1992). Cortical neurons: isolation of contrast gain control. Vision Res. 32, 1409–1410. doi: 10.1016/0042-6989(92)90196-p
Geisler, W. S., and Diehl, R. L. (2002). Bayesian natural selection and the evolution of perceptual systems. Philos. Trans. R. Soc. Lond. B Biol. Sci. 357, 419–448. doi: 10.1098/rstb.2001.1055
Geisler, W. S., Najemnik, J., and Ing, A. D. (2009). Optimal stimulus encoders for natural tasks. J. Vis. 9:17, 1–16. doi: 10.1167/9.13.17
Gelb, A. (1929). “Die Farbenkonstanz der Sehdinge,” in Handbuch der normalen und pathologischen Physiologie, ed. A. Bethe (Berlin: Springer-Verlag), 594–678.
Gibson, J. J. (1966). The Senses Considered as Perceptual Systems. Boston: Houghton Mifflin.
Gibson, J. J. (1979). The Ecological Approach to Visual Perception. Hillsdale, NJ: Lawrence Erlbaum.
Graham, D. J., and Field, D. J. (2007). “Efficient coding of natural images,” in New Encyclopedia of Neuroscience, ed. L. R. Squire (New York: Elsevier), 19–27.
Helmholtz, H. (1866/1924). Helmholtz’s Treatise on Physiological Optics, Vols. I–III, trans. J. P. C. Southall from the 3rd German Edn., 1909 (New York: The Optical Society of America).
Hosoya, T., Baccus, S. A., and Meister, M. (2005). Dynamic predictive coding by the retina. Nature 436, 71–77. doi: 10.1038/nature03689
Howe, C. Q., and Purves, D. (2005). Perceiving Geometry: Geometrical Illusions Explained by Natural Scene Statistics. New York: Springer.
Hubel, D. H. (1988). Eye, Brain, and Vision. New York: Scientific American Library.
Hubel, D. H., and Wiesel, T. (2005). Brain and Visual Perception: The Story of a 25-Year Collaboration. New York: Oxford University Press.
Jancke, D., Erlhagen, W., Dinse, H. R., Akhavan, A. C., Giese, M., Steinhage, A., et al. (1999). Parametric population representation of retinal location: neuronal interaction dynamics in cat primary visual cortex. J. Neurosci. 19, 9016–9028.
Jones, M., and Love, B. C. (2011). Bayesian fundamentalism or enlightenment? On the explanatory status and theoretical contributions of Bayesian models of cognition. Behav. Brain Sci. 34, 169–188; discussion 188–231. doi: 10.1017/s0140525x10003134
Kersten, D., Mamassian, P., and Yuille, A. (2004). Object perception as Bayesian inference. Annu. Rev. Psychol. 55, 271–304. doi: 10.1146/annurev.psych.55.090902.142005
Kersten, D., and Yuille, A. (2003). Bayesian models of object perception. Curr. Opin. Neurobiol. 13, 1–9. doi: 10.1016/s0959-4388(03)00042-4
Knill, D. C., and Pouget, A. (2004). The Bayesian brain: the role of uncertainty in neural coding and computation. Trends Neurosci. 27, 712–719. doi: 10.1016/j.tins.2004.10.007
Knill, D. C., and Richards, W. (1996). Perception as Bayesian Inference. Cambridge: Cambridge University Press.
Knill, D. C., and Saunders, J. A. (2003). Do humans optimally integrate stereo and texture information for judgments of surface slant? Vision Res. 43, 2539–2558. doi: 10.1016/s0042-6989(03)00458-9
Koffka, K. (1935). Principles of Gestalt Psychology. New York: Harcourt Brace.
Köhler, W. (1947). Gestalt Psychology: An Introduction to New Concepts in Modern Psychology. New York: Liveright.
Körding, K. P. (2014). Bayesian statistics: relevant for the brain? Curr. Opin. Neurobiol. 25, 130–133. doi: 10.1016/j.conb.2014.01.003
Körding, K. P., and Wolpert, D. M. (2004). Bayesian integration in sensorimotor learning. Nature 427, 244–247. doi: 10.1038/nature02169
Kuffler, S. W. (1953). Discharge patterns and functional organization of mammalian retina. J. Neurophysiol. 16, 37–68.
Laughlin, S. (1981). A simple coding procedure enhances a neuron’s information capacity. Z. Naturforsch. 36, 910–912.
Lee, T. S., and Mumford, D. (2003). Hierarchical Bayesian inference in the visual cortex. J. Opt. Soc. Am. A 20, 1434–1448. doi: 10.1364/josaa.20.001434
Long, F., Yang, Z., and Purves, D. (2006). Spectral statistics in natural scenes predict hue, saturation and brightness. Proc. Natl. Acad. Sci. U S A 103, 6013–6018. doi: 10.1073/pnas.0600890103
Mamassian, P., Landy, M., and Maloney, L. T. (2002). “Bayesian modelling of visual perception,” in Probabilistic Models of the Brain: Perception and Neural Function, eds R. P. N. Rao, B. A. Olshausen, and M. S. Lewicki (Cambridge, MA: MIT Press), 13–36.
Marr, D. (1982). Vision: A Computational Investigation into the Human Representation and Processing of Visual Information. San Francisco: W. H. Freeman.
Morgenstern, Y., Murray, R. F., and Harris, L. R. (2011). The human visual system’s assumption that light comes from above is weak. Proc. Natl. Acad. Sci. U S A 108, 12551–12553. doi: 10.1073/pnas.1100794108
Morgenstern, Y., Rukmini, D. V., Monson, B. B., and Purves, D. (2014). Properties of artificial neurons that report lightness based on accumulated experience with luminance. Front. Comput. Neurosci. 8:134. doi: 10.3389/fncom.2014.00134
Ng, C., Sundararajan, J., Hogan, M., and Purves, D. (2013). Network connections that evolve to circumvent the inverse optics problem. PLoS One 8:e60490. doi: 10.1371/journal.pone.0060490
Olshausen, B. A., and Field, D. J. (1996). Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature 381, 607–609. doi: 10.1038/381607a0
Olshausen, B. A., and Field, D. J. (2000). Vision and the coding of natural images. Am. Sci. 88, 238–245. doi: 10.1511/2000.3.238
Onat, S., König, P., and Jancke, D. (2011). Natural scene evoked population dynamics across cat primary visual cortex captured with voltage-sensitive dye imaging. Cereb. Cortex 21, 2542–2554. doi: 10.1093/cercor/bhr038
Purves, D., and Lotto, R. B. (2003). Why We See What We Do: An Empirical Theory of Vision. Sunderland, MA: Sinauer Associates.
Purves, D., and Lotto, R. B. (2011). Why We See What We Do Redux: A Wholly Empirical Theory of Vision. Sunderland, MA: Sinauer Associates.
Purves, D., Monson, B. B., Sundararajan, J., and Wojtach, W. T. (2014). How biological vision succeeds in the physical world. Proc. Natl. Acad. Sci. U S A 111, 4750–4755. doi: 10.1073/pnas.1311309111
Purves, D., Wojtach, W. T., and Lotto, R. B. (2011). Understanding vision in wholly empirical terms. Proc. Natl. Acad. Sci. U S A 108, 15588–15595. doi: 10.1073/pnas.1012178108
Rao, R. P., and Ballard, D. H. (1999). Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects. Nat. Neurosci. 2, 79–87. doi: 10.1038/4580
Robinson, J. O. (1998). The Psychology of Visual Illusions. New York: Dover (corrected republication of the 1972 edition published by Hutchinson and Co. in England).
Rock, I. (1984). Perception. New York: Macmillan.
Sakmann, B., and Creutzfeldt, O. D. (1969). Scotopic and mesopic light adaptation in the cat’s retina. Pflügers Arch. 313, 168–185. doi: 10.1007/BF00586245
Schwartz, O., and Simoncelli, E. P. (2001). Natural signal statistics and sensory gain control. Nat. Neurosci. 4, 819–825. doi: 10.1038/90526
Shannon, C. E. (1948). A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656.
Shannon, C. E., and Weaver, W. (1949). The Mathematical Theory of Communication. Urbana: University of Illinois Press.
Sherrington, C. S. (1947). The Integrative Action of the Nervous System. New Haven, CT: Yale University Press.
Simoncelli, E. P., and Olshausen, B. A. (2001). Natural image statistics and neural representation. Annu. Rev. Neurosci. 24, 1193–1216. doi: 10.1146/annurev.neuro.24.1.1193
Srinivasan, M. V., Laughlin, S. B., and Dubs, A. (1982). Predictive coding: a fresh view of inhibition in the retina. Proc. R. Soc. Lond. B Biol. Sci. 216, 427–459. doi: 10.1098/rspb.1982.0085
Stevens, S. S. (1975). Psychophysics. New York: John Wiley.
Stocker, A. A., and Simoncelli, E. P. (2006). Noise characteristics and prior expectations in human visual speed perception. Nat. Neurosci. 9, 578–585. doi: 10.1038/nn1669
Sung, K., Wojtach, W. T., and Purves, D. (2009). An empirical explanation of aperture effects. Proc. Natl. Acad. Sci. U S A 106, 298–303. doi: 10.1073/pnas.0811702106
Tassinari, H., Hudson, T. E., and Landy, M. S. (2006). Combining priors and noisy visual cues in a rapid pointing task. J. Neurosci. 26, 10154–10163. doi: 10.1523/jneurosci.2779-06.2006
van Hateren, J. H., and van der Schaaf, A. (1998). Independent component filters of natural images compared with simple cells in primary visual cortex. Proc. R. Soc. Lond. B 265, 359–366. doi: 10.1098/rspb.1998.0303
Vinje, W. E., and Gallant, J. L. (2000). Sparse coding and decorrelation in primary visual cortex during natural vision. Science 287, 1273–1276. doi: 10.1126/science.287.5456.1273
Weiss, Y., Simoncelli, E. P., and Adelson, E. H. (2002). Motion illusions as optimal percepts. Nat. Neurosci. 5, 598–604. doi: 10.1038/nn858
Wertheimer, M. (1912/1950). “Laws of organization in perceptual forms,” in A Sourcebook of Gestalt Psychology, ed. and trans. W. D. Ellis (New York: Humanities Press), 71–88.
Wojtach, W. T., Sung, K., Truong, S., and Purves, D. (2008). An empirical explanation of the flash-lag effect. Proc. Natl. Acad. Sci. U S A 105, 16338–16343. doi: 10.1073/pnas.0808916105
Wojtach, W. T., Sung, K., and Purves, D. (2009). An empirical explanation of the speed-distance effect. PLoS One 4:e6771. doi: 10.1371/journal.pone.0006771
Yang, Z., and Purves, D. (2004). The statistical structure of natural light patterns determines perceived light intensity. Proc. Natl. Acad. Sci. U S A 101, 8745–8750. doi: 10.1073/pnas.0402192101
Keywords: vision, visual perception, feature detection, Bayesian probability, efficient coding, empirical ranking

Citation: Purves D, Morgenstern Y and Wojtach WT (2015) Perception and Reality: Why a Wholly Empirical Paradigm is Needed to Understand Vision. Front. Syst. Neurosci. 9:156. doi: 10.3389/fnsys.2015.00156

Received: 30 July 2015; Accepted: 29 October 2015; Published: 18 November 2015.

Edited by: Chrystalina A. Antoniades, University of Oxford, UK

Reviewed by: Dirk Jancke, Ruhr-University Bochum, Germany; Rava Azeredo Da Silveira, École Normale Supérieure, France; Walter Glannon, University of Calgary, Canada

Copyright © 2015 Purves, Morgenstern and Wojtach. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution and reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Dale Purves, purves@neuro.duke.edu

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
Theory"},stage:{id:H,name:g},keywords:["Vision","Visual Perception","feature detection","Bayesian probability","efficient coding","empirical ranking"],authors:[{id:ac,firstName:ad,lastName:"Purves",givenNames:ad,isCorresponding:h,isProfilePublic:h,userId:ac,affiliations:[{organizationName:ae,countryName:af,cityName:g,stateName:g,zipCode:g}]},{id:ag,firstName:ah,lastName:"Morgenstern",givenNames:ah,isCorresponding:l,isProfilePublic:h,userId:ag,affiliations:[{organizationName:ai,countryName:aj,cityName:g,stateName:g,zipCode:g}]},{id:ak,firstName:al,lastName:"Wojtach",givenNames:al,isCorresponding:l,isProfilePublic:h,userId:ak,affiliations:[{organizationName:ae,countryName:af,cityName:g,stateName:g,zipCode:g},{organizationName:ai,countryName:aj,cityName:g,stateName:g,zipCode:g}]}],editors:[{id:am,firstName:"Chrystalina",lastName:"Antoniades",givenNames:"Chrystalina A",isCorresponding:l,isProfilePublic:h,userId:am,affiliations:[{organizationName:"Nuffield Department of Clinical Neurosciences, Medical Sciences Division, University of Oxford",countryName:"United Kingdom",cityName:g,stateName:g,zipCode:g}]}],reviewers:[{id:an,firstName:ao,lastName:"Jancke",givenNames:ao,isCorresponding:l,isProfilePublic:h,userId:an,affiliations:[{organizationName:"Ruhr University Bochum",countryName:"Germany",cityName:g,stateName:g,zipCode:g}]},{id:ap,firstName:aq,lastName:"Azeredo da Silveira",givenNames:aq,isCorresponding:l,isProfilePublic:h,userId:ap,affiliations:[{organizationName:"École Normale Supérieure",countryName:"France",cityName:g,stateName:g,zipCode:g}]},{id:ar,firstName:as,lastName:"Glannon",givenNames:as,isCorresponding:l,isProfilePublic:h,userId:ar,affiliations:[{organizationName:"University of Calgary",countryName:"Canada",cityName:g,stateName:g,zipCode:g}]}],journal:{id:m,slug:q,name:o,shortName:D,electronicISSN:E,field:{id:Z,domainId:c,__typename:_},specialtyId:Y,journalSectionPaths:[],__typename:a},section:e,impactMetrics:{views:21721,downloads:2710,citations:43},volume:I,articleVolume:"Volume 9 - 2015",relatedArticles:[{id:224020,typeId:29,typeName:"General Commentary",title:"Commentary: Perception and Reality: Why a Wholly Empirical Paradigm is Needed to Understand Vision",url:"\u002Farticles\u002F10.3389\u002Ffnsys.2016.00077",isOriginalArticle:l,recentDate:new Date(1474416000000)}],isPublishedV2:l,contents:{titleHtml:G,fullTextHtml:"\u003Cdiv class=\"JournalAbstract\"\u003E\r\n\u003Ca id=\"h1\" name=\"h1\"\u003E\u003C\u002Fa\u003E\r\n\u003Cdiv class=\"authors\"\u003E\u003Cspan class=\"author-wrapper notranslate\"\u003E\r\n\u003Ca href=\"https:\u002F\u002Floop.frontiersin.org\u002Fpeople\u002F37265\" class=\"user-id-37265\"\u003E\u003Cimg class=\"pr5\" src=\"https:\u002F\u002Floop.frontiersin.org\u002Fimages\u002Fprofile\u002F37265\u002F74\" onerror=\"this.onerror=null;this.src='https:\u002F\u002Floop.frontiersin.org\u002Fcdn\u002Fimages\u002Fprofile\u002Fdefault_32.jpg';\" alt=\"\\r\\nDale Purves*\"\u003EDale Purves\u003C\u002Fa\u003E\u003Csup\u003E1*\u003C\u002Fsup\u003E\u003C\u002Fspan\u003E\u003Cspan class=\"author-wrapper notranslate\"\u003E\u003Ca href=\"https:\u002F\u002Floop.frontiersin.org\u002Fpeople\u002F101928\" class=\"user-id-101928\"\u003E\u003Cimg class=\"pr5\" src=\"https:\u002F\u002Floop.frontiersin.org\u002Fimages\u002Fprofile\u002F101928\u002F74\" onerror=\"this.onerror=null;this.src='https:\u002F\u002Floop.frontiersin.org\u002Fcdn\u002Fimages\u002Fprofile\u002Fdefault_32.jpg';\" alt=\"Yaniv Morgenstern\"\u003EYaniv 
Morgenstern\u003C\u002Fa\u003E\u003Csup\u003E2\u003C\u002Fsup\u003E\u003C\u002Fspan\u003E\u003Cspan class=\"author-wrapper notranslate\"\u003E\u003Ca href=\"https:\u002F\u002Floop.frontiersin.org\u002Fpeople\u002F149132\" class=\"user-id-149132\"\u003E\u003Cimg class=\"pr5\" src=\"https:\u002F\u002Floop.frontiersin.org\u002Fimages\u002Fprofile\u002F149132\u002F74\" onerror=\"this.onerror=null;this.src='https:\u002F\u002Floop.frontiersin.org\u002Fcdn\u002Fimages\u002Fprofile\u002Fdefault_32.jpg';\" alt=\"William T. Wojtach,\"\u003EWilliam T. Wojtach\u003C\u002Fa\u003E\u003Csup\u003E1,2\u003C\u002Fsup\u003E\u003C\u002Fspan\u003E\u003C\u002Fdiv\u003E\r\n\u003Cul class=\"notes\"\u003E\r\n\u003Cli\u003E\u003Cspan\u003E\u003Csup\u003E1\u003C\u002Fsup\u003E\u003C\u002Fspan\u003EDuke Institute for Brain Sciences, Duke University, Durham, NC, USA\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Cspan\u003E\u003Csup\u003E2\u003C\u002Fsup\u003E\u003C\u002Fspan\u003EDuke-NUS Graduate Medical School, Singapore, Singapore\u003C\u002Fli\u003E\r\n\u003C\u002Ful\u003E \r\n\u003Cp\u003EA central puzzle in vision science is how perceptions that are routinely at odds with physical measurements of real world properties can arise from neural responses that nonetheless lead to effective behaviors. Here we argue that the solution depends on: (1) rejecting the assumption that the goal of vision is to recover, however imperfectly, properties of the world; and (2) replacing it with a paradigm in which perceptions reflect biological utility based on past experience rather than objective features of the environment. Present evidence is consistent with the conclusion that conceiving vision in wholly empirical terms provides a plausible way to understand what we see and why.\u003C\u002Fp\u003E \r\n\u003Cdiv class=\"clear\"\u003E\u003C\u002Fdiv\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"JournalFullText\"\u003E\r\n\u003Ca id=\"h2\" name=\"h2\"\u003E\u003C\u002Fa\u003E\u003Ch2\u003EIntroduction\u003C\u002Fh2\u003E\r\n\u003Cp class=\"mb15\"\u003EA widely accepted concept of vision in recent decades stems from studies carried out by Stephen Kuffler, David Hubel and Torsten Wiesel beginning in the 1950s (\u003Ca href=\"#B45\"\u003EKuffler, 1953\u003C\u002Fa\u003E; \u003Ca href=\"#B33\"\u003EHubel and Wiesel, 2005\u003C\u002Fa\u003E). This seminal work showed that neurons in the primary visual pathway of cats and monkeys respond to light stimuli in specific ways, implying that the detection of retinal image features plays a central role in visual perception. Based on the properties of simpler input-level cells, Hubel and Wiesel discovered that neurons in V1 respond selectively to retinal activation elicited by oriented bars of light, bars of a certain length, bars moving in different directions, and stimuli with different spectral properties. These and other findings earned Hubel and Wiesel a Nobel Prize in 1981 (Kuffler had died in 1980), and inspired a generation of scientists to pursue similar electrophysiological and neuroanatomical research in a variety of species in the ongoing effort to reveal how vision works.\u003C\u002Fp\u003E\r\n\u003Cp class=\"mb15\"\u003EA seemingly straightforward interpretation of these observations is that the visual system operates analytically, extracting features from retinal images, efficiently filtering and processing image features in a series of computational steps, and ultimately combining them to provide a close approximation of physical reality that is then used to guide behavior. 
This concept of visual perception is logical, accords with electrophysiological and anatomical evidence, and has the further merit of being similar to the operation of computers, providing an analogy that connects biological vision with machine vision and artificial intelligence (\u003Ca href=\"#B50\"\u003EMarr, 1982\u003C\u002Fa\u003E). Finally, this interpretation concurs with the impression that we see the world more or less as it really is and behave accordingly. Indeed, to do otherwise would seem to defy common sense and insure failure.\u003C\u002Fp\u003E\r\n\u003Cp class=\"mb0\"\u003EAttractive though it is, this interpretation fails to consider an axiomatic fact about biological vision: retinal images conflate the physical properties of objects, and therefore cannot be used to recover the objective properties of the world (Figure \u003Ca href=\"#F1\"\u003E1\u003C\u002Fa\u003E). Consequently, the basic visual qualities we perceive—e.g., lightness, color, form, distance, depth and motion—cannot specify reality. A further fact that adds to the challenge of understanding how vision works is the discrepancy between these perceived qualities and the physical parameters of objects and conditions in the world (Figure \u003Ca href=\"#F2\"\u003E2\u003C\u002Fa\u003E). As numerous psychophysical studies have shown, lightness and darkness percepts are at odds with luminance, color is at odds with distributions of spectral power, size, distance and depth are at odds with geometrical measurements, and speeds and directions of motion are at odds with measured vectors (\u003Ca href=\"#B25\"\u003EGelb, 1929\u003C\u002Fa\u003E; \u003Ca href=\"#B69\"\u003EStevens, 1975\u003C\u002Fa\u003E; \u003Ca href=\"#B62\"\u003ERock, 1984\u003C\u002Fa\u003E; \u003Ca href=\"#B61\"\u003ERobinson, 1998\u003C\u002Fa\u003E; \u003Ca href=\"#B56\"\u003EPurves and Lotto, 2003\u003C\u002Fa\u003E; \u003Ca href=\"#B77\"\u003EWojtach et al., 2008\u003C\u002Fa\u003E, \u003Ca href=\"#B78\"\u003E2009\u003C\u002Fa\u003E; \u003Ca href=\"#B71\"\u003ESung et al., 2009\u003C\u002Fa\u003E; \u003Ca href=\"#B58\"\u003EPurves et al., 2014\u003C\u002Fa\u003E). 
These differences between perception and reality cannot be dismissed as minor errors or approximations that are “close enough” to succeed, since the discrepancies are ubiquitous and often profound (see Figure \u003Ca href=\"#F2\"\u003E2A\u003C\u002Fa\u003E, for example).\u003C\u002Fp\u003E\r\n\u003Cdiv class=\"DottedLine\"\u003E\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"Imageheaders\"\u003EFIGURE 1\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"FigureDesc\"\u003E\r\n\u003Ca href=\"https:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g001.jpg\" name=\"figure1\" target=\"_blank\"\u003E\r\n\n \u003Cpicture\u003E\n \u003Csource type=\"image\u002Fwebp\" srcset=\"https:\u002F\u002Fimages-provider.frontiersin.org\u002Fapi\u002Fipx\u002Fw=480&f=webp\u002Fhttps:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g001.jpg\" media=\"(max-width: 563px)\"\u003E\u003Csource type=\"image\u002Fwebp\" srcset=\"https:\u002F\u002Fimages-provider.frontiersin.org\u002Fapi\u002Fipx\u002Fw=370&f=webp\u002Fhttps:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g001.jpg\" media=\"(max-width: 1024px)\"\u003E\u003Csource type=\"image\u002Fwebp\" srcset=\"https:\u002F\u002Fimages-provider.frontiersin.org\u002Fapi\u002Fipx\u002Fw=290&f=webp\u002Fhttps:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g001.jpg\" media=\"(max-width: 1441px)\"\u003E\u003Csource type=\"image\u002Fwebp\" srcset=\"https:\u002F\u002Fimages-provider.frontiersin.org\u002Fapi\u002Fipx\u002Fw=410&f=webp\u002Fhttps:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g001.jpg\" media=\"\"\u003E\u003Csource type=\"image\u002Fjpg\" srcset=\"https:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g001.jpg\" media=\"\"\u003E \u003Cimg src=\"https:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g001.jpg\" alt=\"www.frontiersin.org\" id=\"F1\" loading=\"lazy\"\u003E\n \u003C\u002Fpicture\u003E\n\u003C\u002Fa\u003E\r\n\u003Cp\u003E\u003Cstrong\u003EFigure 1. The major obstacle to the concept of vision as feature representation. (A)\u003C\u002Fstrong\u003E Luminance values in retinal stimuli are determined by illumination and reflectance, as well as a host of other factors (e.g., atmospheric transmittance, spectral content, and many more). These physical parameters are conflated in light stimuli, however, precluding biological measurements of the objective world in which perceptions and behaviors must play out. 
\u003Cstrong\u003E(B)\u003C\u002Fstrong\u003E The analogous conflation of geometrical information in retinal stimuli.\u003C\u002Fp\u003E\u003C\u002Fdiv\u003E \r\n\u003Cdiv class=\"clear\"\u003E\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"DottedLine mb15\"\u003E\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"Imageheaders\"\u003EFIGURE 2\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"FigureDesc\"\u003E\r\n\u003Ca href=\"https:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g002.jpg\" name=\"figure2\" target=\"_blank\"\u003E\r\n\n \u003Cpicture\u003E\n \u003Csource type=\"image\u002Fwebp\" srcset=\"https:\u002F\u002Fimages-provider.frontiersin.org\u002Fapi\u002Fipx\u002Fw=480&f=webp\u002Fhttps:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g002.jpg\" media=\"(max-width: 563px)\"\u003E\u003Csource type=\"image\u002Fwebp\" srcset=\"https:\u002F\u002Fimages-provider.frontiersin.org\u002Fapi\u002Fipx\u002Fw=370&f=webp\u002Fhttps:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g002.jpg\" media=\"(max-width: 1024px)\"\u003E\u003Csource type=\"image\u002Fwebp\" srcset=\"https:\u002F\u002Fimages-provider.frontiersin.org\u002Fapi\u002Fipx\u002Fw=290&f=webp\u002Fhttps:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g002.jpg\" media=\"(max-width: 1441px)\"\u003E\u003Csource type=\"image\u002Fwebp\" srcset=\"https:\u002F\u002Fimages-provider.frontiersin.org\u002Fapi\u002Fipx\u002Fw=410&f=webp\u002Fhttps:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g002.jpg\" media=\"\"\u003E\u003Csource type=\"image\u002Fjpg\" srcset=\"https:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g002.jpg\" media=\"\"\u003E \u003Cimg src=\"https:\u002F\u002Fwww.frontiersin.org\u002Ffiles\u002FArticles\u002F163471\u002Ffnsys-09-00156-HTML\u002Fimage_m\u002Ffnsys-09-00156-g002.jpg\" alt=\"www.frontiersin.org\" id=\"F2\" loading=\"lazy\"\u003E\n \u003C\u002Fpicture\u003E\n\u003C\u002Fa\u003E\r\n\u003Cp\u003E\u003Cstrong\u003EFigure 2. The perception of basic visual qualities is at odds with the world assessed by physical instruments. (A)\u003C\u002Fstrong\u003E One of many examples generated over the last century or more illustrating the discrepancy between luminance and lightness. Although each of the patches indicated in the inset returns the same amount of light to the eye (i.e., they have the same luminance), their apparent lightness values in the scene are very different. \u003Cstrong\u003E(B)\u003C\u002Fstrong\u003E An example of the discrepancy between perceived and measured geometry that has again been repeatedly documented since the mid-19th century. 
The lines on the left are all of equal length, but, as shown on the right, are perceived differently depending on their orientation (apparent length is expressed in relation to the horizontal line, which is seen as shortest in psychophysical testing).\u003C\u002Fp\u003E\u003C\u002Fdiv\u003E \r\n\u003Cdiv class=\"clear\"\u003E\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"DottedLine\"\u003E\u003C\u002Fdiv\u003E\r\n\u003Cp class=\"mb0 w100pc float_left mt15\"\u003EThe result has been diminished confidence in concepts of vision based on retinal feature detection, opening the door to other ways of understanding visual perception, the purposes of visual circuitry, and the genesis of visually guided behavior. A common denominator of these alternative views is the use of past experience—i.e., empirical evidence—to explain vision.\u003C\u002Fp\u003E\r\n\u003Ca id=\"h3\" name=\"h3\"\u003E\u003C\u002Fa\u003E\u003Ch2\u003EEarly Ideas About Vision on an Empirical Basis\u003C\u002Fh2\u003E\r\n\u003Cp class=\"mb15\"\u003EThe loss of information due to the transformation of three-dimensional (3-D) Euclidean space into two-dimensional (2-D) images and the introduction of noise inherent in biological processes led some early schools of psychology to advocate theories of vision that included the influence of lifetime experience. This line of thinking began in the mid-19th century when Hermann von Helmholtz proposed that perceptions arising from impoverished images are supplemented by “unconscious inferences” about reality made on the basis of individual experience (\u003Ca href=\"#B29\"\u003EHelmholtz, 1866\u002F1924\u003C\u002Fa\u003E). He added the qualifier “unconscious” because observers are rarely aware of how their past experience could affect perception.\u003C\u002Fp\u003E\r\n\u003Cp class=\"mb15\"\u003EFor much of the first half of the 20th century the role of empirical information in determining perception was conceived in terms of gestalt laws or other heuristics. The gestalt school was founded shortly after the turn of the century by \u003Ca href=\"#B76\"\u003EMax Wertheimer (1912\u002F1950)\u003C\u002Fa\u003E, and advanced under the aegis of his students \u003Ca href=\"#B41\"\u003EKurt Koffka (1935)\u003C\u002Fa\u003E and \u003Ca href=\"#B42\"\u003EWolfgang Köhler (1947)\u003C\u002Fa\u003E. At the core of gestalt theory is the idea that the “units of experience go with the functional units in the underlying physiological processes” (\u003Ca href=\"#B42\"\u003EWolfgang Köhler, 1947\u003C\u002Fa\u003E, p. 63). In gestalt terms, this influence was codified as the “the law of präganz” (meaning “conciseness”), expressing the idea that, based on experience, any perception would be determined by the simplest possible source of the image in question. Building on some of these ideas Egon Brünswik argued further that, in order to fully understand perception, the connection between the organism and the environment must be clarified. Given that information acquired by sense organs is uncertain, he supposed that visual animals must rely on the statistical nature of environments to achieve their goals. 
As described in his theory of “probabilistic functionalism” (\u003Ca href=\"#B13\"\u003EBrünswik, 1956\u002F1997\u003C\u002Fa\u003E), Brünswik anticipated some current empirical approaches to vision based on probable world states (see Vision as Bayesian Inference).\u003C\u002Fp\u003E\r\n\u003Cp class=\"mb15\"\u003EBrünswik’s emphasis on the environment influenced the subsequent work of \u003Ca href=\"#B26\"\u003EJames Gibson (1966\u003C\u002Fa\u003E, \u003Ca href=\"#B27\"\u003E1979)\u003C\u002Fa\u003E, who carried empirical thinking in yet another direction by arguing that perception is determined by the objects and circumstances observers are exposed to when moving though the world. Gibson proposed that observers could directly perceive their environment by relying on “invariances” in the structure of retinal images, a position similar to the statistical regularity of objects and conditions (e.g., commonly encountered ratios, proportions, and the like) identified by Brünswik. The invariances used by agents exploring the world led Gibson to posit vision as a “perceptual system” that included both the body and its environment—a position fundamentally different from Helmholtz’s idea of empirically modifying retinal image information acquired by a “sensing system”. In the case of size and distance, for example, Gibson maintained that the ratio of object projections to background textures provided the kind of invariant information that would allow an observer to directly apprehend otherwise ambiguous size-distance relationships. He took the mechanism to be one of “resonance” between the activity of a perceptual system and the properties of the environment that gave rise to light stimuli.\u003C\u002Fp\u003E\r\n\u003Cp class=\"mb15\"\u003EAlthough these early empirical strategies were imaginative and in some ways prescient, they suffered from an absence of ties to the structure and function of animal visual systems. Thus Helmholtz, Wertheimer, Koffka, Köhler, Brünswik, and Gibson were necessarily vague, speculative or simply mute about how empirical information might be usefully implemented in visual system physiology and anatomy. In consequence, empirical approaches to vision began to languish at mid-century, while visual neurobiology with its increasingly concrete evidence about how visual systems operate at the neuronal level came to dominate vision science in the 1960s and for the next several decades (\u003Ca href=\"#B33\"\u003EHubel and Wiesel, 2005\u003C\u002Fa\u003E).\u003C\u002Fp\u003E\r\n\u003Cp class=\"mb0\"\u003EBy the 1990s, however, it was becoming increasingly apparent that, despite key insights into the feature-selective properties of visual neurons, neurophysiological and neuroanatomical approaches to perception were unable to explain how processing retinal image features could, in principle, contend with the inability of visual stimuli to convey information about the objective properties of the world (see Figure \u003Ca href=\"#F1\"\u003E1\u003C\u002Fa\u003E). At the same time, advances in computer hardware and software were rapidly making the evaluation of large datasets relatively easy. Accordingly, investigators began to re-examine the merits of vision determined by past experience. 
The basis of much of this thinking has been that visual perception could be understood as probabilistic inferences about the most likely physical states of the world.\u003C\u002Fp\u003E\r\n\u003Ca id=\"h4\" name=\"h4\"\u003E\u003C\u002Fa\u003E\u003Ch2\u003EVision as Bayesian Inference\u003C\u002Fh2\u003E\r\n\u003Cp class=\"mb0\"\u003EThe most popular approach to vision as statistical inference is based on Bayesian decision theory (\u003Ca href=\"#B39\"\u003EKnill and Richards, 1996\u003C\u002Fa\u003E; \u003Ca href=\"#B49\"\u003EMamassian et al., 2002\u003C\u002Fa\u003E; \u003Ca href=\"#B37\"\u003EKersten and Yuille, 2003\u003C\u002Fa\u003E; \u003Ca href=\"#B47\"\u003ELee and Mumford, 2003\u003C\u002Fa\u003E; \u003Ca href=\"#B36\"\u003EKersten et al., 2004\u003C\u002Fa\u003E; \u003Ca href=\"#B38\"\u003EKnill and Pouget, 2004\u003C\u002Fa\u003E; \u003Ca href=\"#B43\"\u003EKörding, 2014\u003C\u002Fa\u003E). In effect, investigators built on Helmholtz’s idea of unconscious inference, formally recasting it in terms of Bayes’ theorem (\u003Ca href=\"#B6\"\u003EBayes, 1763\u003C\u002Fa\u003E), a widely used procedure for assessing the probability of an inference being correct given a set of inconclusive evidence. The theorem states that the probability of inference \u003Ci\u003EA\u003C\u002Fi\u003E being true given evidence \u003Ci\u003EB\u003C\u002Fi\u003E (the posterior probability) depends on the probability of obtaining \u003Ci\u003EB\u003C\u002Fi\u003E given inference \u003Ci\u003EA\u003C\u002Fi\u003E (the likelihood), multiplied by the probability of inference \u003Ci\u003EA\u003C\u002Fi\u003E being true (the prior), these factors typically being normalized by dividing by the probability of evidence \u003Ci\u003EB\u003C\u002Fi\u003E. Thus the theorem can be written as:\u003C\u002Fp\u003E\r\n\u003Cdiv class=\"equationImageholder\"\u003E\r\n\u003Cmath id=\"M1\" dislay=\"block\"\u003E\u003Cmi mathsize=\"11pt\" mathcolor=\"black\"\u003Ep\u003C\u002Fmi\u003E\u003Cmo stretchy='false' mathsize=\"11pt\" mathcolor=\"black\"\u003E(\u003C\u002Fmo\u003E\u003Cmi mathsize=\"11pt\" mathcolor=\"black\"\u003EA\u003C\u002Fmi\u003E\u003Cmo stretchy='false' mathsize=\"9pt\" mathcolor=\"black\"\u003E|\u003C\u002Fmo\u003E\u003Cmi mathsize=\"11pt\" mathcolor=\"black\"\u003EB\u003C\u002Fmi\u003E\u003Cmo stretchy='false' mathsize=\"11pt\" mathcolor=\"black\"\u003E)\u003C\u002Fmo\u003E\u003Cmo stretchy='false' mathsize=\"11pt\" mathcolor=\"black\"\u003E=\u003C\u002Fmo\u003E\u003Cmfrac\u003E\u003Cmrow\u003E\u003Cmi mathsize=\"11pt\" mathcolor=\"black\"\u003Ep\u003C\u002Fmi\u003E\u003Cmo stretchy='false' mathsize=\"11pt\" mathcolor=\"black\"\u003E(\u003C\u002Fmo\u003E\u003Cmi mathsize=\"11pt\" mathcolor=\"black\"\u003EB\u003C\u002Fmi\u003E\u003Cmo stretchy='false' mathsize=\"9pt\" mathcolor=\"black\"\u003E|\u003C\u002Fmo\u003E\u003Cmi mathsize=\"11pt\" mathcolor=\"black\"\u003EA\u003C\u002Fmi\u003E\u003Cmo stretchy='false' mathsize=\"11pt\" mathcolor=\"black\"\u003E)\u003C\u002Fmo\u003E\u003Cmi mathsize=\"11pt\" mathcolor=\"black\"\u003Ep\u003C\u002Fmi\u003E\u003Cmo stretchy='false' mathsize=\"11pt\" mathcolor=\"black\"\u003E(\u003C\u002Fmo\u003E\u003Cmi mathsize=\"11pt\" mathcolor=\"black\"\u003EA\u003C\u002Fmi\u003E\u003Cmo stretchy='false' mathsize=\"11pt\" mathcolor=\"black\"\u003E)\u003C\u002Fmo\u003E\u003C\u002Fmrow\u003E\u003Cmrow\u003E\u003Cmi mathsize=\"11pt\" mathcolor=\"black\"\u003Ep\u003C\u002Fmi\u003E\u003Cmo stretchy='false' mathsize=\"11pt\" mathcolor=\"black\"\u003E(\u003C\u002Fmo\u003E\u003Cmi 
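Equation (1) can be made concrete with a small numerical illustration. The sketch below is ours rather than the authors'; the two candidate world states and all of the probabilities are invented solely to show the arithmetic by which a prior is reweighted by a likelihood.

```python
# Minimal numerical illustration of Bayes' theorem, equation (1).
# The states and probabilities are hypothetical.

# Priors p(A): how probable each candidate world state is beforehand.
prior = {"state_1": 0.7, "state_2": 0.3}

# Likelihoods p(B|A): how probable the observed evidence B is under each state.
likelihood = {"state_1": 0.2, "state_2": 0.9}

# Evidence p(B): marginalize, i.e., sum p(B|A) * p(A) over all states.
p_B = sum(likelihood[s] * prior[s] for s in prior)

# Posteriors p(A|B) = p(B|A) * p(A) / p(B).
posterior = {s: likelihood[s] * prior[s] / p_B for s in prior}

print(posterior)  # {'state_1': 0.341..., 'state_2': 0.658...}
```

Although state_1 is favored a priori (0.7 vs. 0.3), the evidence is far more probable under state_2, so the posterior reverses the ranking; this reweighting of priors by likelihoods is the inference step that the Bayesian vision literature attributes to perception.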
mathsize=\"11pt\" mathcolor=\"black\"\u003EB\u003C\u002Fmi\u003E\u003Cmo stretchy='false' mathsize=\"11pt\" mathcolor=\"black\"\u003E)\u003C\u002Fmo\u003E\u003C\u002Fmrow\u003E\u003C\u002Fmfrac\u003E\u003Cmrow\u003E\u003Cmo stretchy='false' mathsize=\"11pt\" mathcolor=\"black\" lspace=\"3em\"\u003E(\u003C\u002Fmo\u003E\u003Cmn mathsize=\"11pt\" mathcolor=\"black\"\u003E1\u003C\u002Fmn\u003E\u003Cmo stretchy='false' mathsize=\"11pt\" mathcolor=\"black\"\u003E)\u003C\u002Fmo\u003E\u003C\u002Fmrow\u003E\u003C\u002Fmath\u003E \r\n\u003Cdiv class=\"clear\"\u003E\u003C\u002Fdiv\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cp class=\"mb0\"\u003ETo illustrate a Bayesian approach to vision, consider a simple example in which an image is generated by a single light source and a given surface reflectance (e.g., \u003Ca href=\"#B12\"\u003EBrainard, 2009\u003C\u002Fa\u003E; see also \u003Ca href=\"#B1\"\u003EAllred and Brainard, 2013\u003C\u002Fa\u003E). Although many physical factors are involved in generating natural images (see Figure \u003Ca href=\"#F1\"\u003E1A\u003C\u002Fa\u003E), the luminance values (L) in an image are primarily the product of the intensity of illumination (I) and reflectance properties of surfaces (R). Thus the first step in validating the idea that vision follows Bayes’ theorem would be to determine the probability distributions of surface reflectance and illumination values—the priors \u003Ci\u003Ep\u003C\u002Fi\u003E(R) and \u003Ci\u003Ep\u003C\u002Fi\u003E(I), respectively—which can be approximated by measurements in the environment. The next step would be to derive the likelihood function \u003Ci\u003Ep\u003C\u002Fi\u003E(L|R, I), i.e., the probability of a specific luminance being generated by various surface reflectance and illumination intensities. 
The posterior distribution, p(R, I|L), is then obtained by multiplying the prior distribution by the likelihood function:

\[
p(R, I \mid L) = \frac{p(L \mid R, I)\,p(R)\,p(I)}{p(L)} \tag{2}
\]

Because the posterior distribution indicates only the relative probabilities of a set of possible sources, a final step is to select particular reflectance and illumination values from the set according to an assumed gain-loss function.
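To make these steps concrete, the following minimal sketch evaluates equation (2) on a discretized grid and selects the posterior maximum, the choice implied by a 0–1 gain-loss function. The priors, noise level, and parameter ranges are arbitrary stand-ins for illustration, not measured environmental statistics:

```python
import numpy as np

# Candidate sources, discretized (ranges chosen arbitrarily for illustration).
R = np.linspace(0.05, 1.0, 200)      # surface reflectance
I = np.linspace(0.1, 10.0, 200)      # illuminant intensity
Rg, Ig = np.meshgrid(R, I, indexing="ij")

# Hypothetical priors p(R) and p(I); real applications would approximate
# these from environmental measurements.
pR = np.exp(-Rg / 0.3)                  # darker surfaces assumed more common
pI = np.exp(-0.5 * np.log(Ig) ** 2)     # illumination spread on a log scale

# Likelihood p(L | R, I): observed luminance L = I * R plus Gaussian noise.
# Note the ridge of (R, I) pairs with I * R = L_obs: the conflation of
# factors that makes the inverse problem ambiguous.
L_obs, sigma = 1.5, 0.1
lik = np.exp(-0.5 * ((Ig * Rg - L_obs) / sigma) ** 2)

post = lik * pR * pI
post /= post.sum()                      # normalization stands in for p(L)

r_idx, i_idx = np.unravel_index(post.argmax(), post.shape)
print(f"MAP estimate: R = {R[r_idx]:.2f}, I = {I[i_idx]:.2f}")
```

The priors are what break the tie among the many (R, I) pairs consistent with the same observed luminance; a different gain-loss function would pick a different point from the same posterior.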
The perceptual outcome—the lightness seen—would presumably accord with the surface reflectance at the most likely combination of surface reflectance and illuminant intensity values. Thus, perceived lightness is taken to be an estimate of surface reflectance.

Experimental assessments of the responses to visual stimuli are made in terms of a Bayesian "ideal observer", defined as an observer who always responds to the most probable state of the world (Geisler, 2011)—e.g., the most probable surface reflectance value that could have given rise to a retinal luminance value. As indicated in equation (2), an experimenter can assess how closely humans approach this ideal by measuring perceptual estimates—in this case, by gauging perceived lightness, which is assumed to be an estimate of surface reflectance—and comparing these to predictions that combine stimulus information in a statistically optimal fashion (e.g., Ernst and Banks, 2002; Weiss et al., 2002). Studies of this sort have supported the conclusion that vision can indeed be modeled as a system based on Bayesian inferences. Whether estimating surface slant (Knill and Saunders, 2003), responding to apparent motion stimuli (Weiss et al., 2002; Stocker and Simoncelli, 2006), planning movements (Körding and Wolpert, 2004; Tassinari et al., 2006), integrating haptic and visual cues (Ernst and Banks, 2002), combining prior real-world assumptions with those in the scene at hand (Morgenstern et al., 2011), or reporting lightness (Allred and Brainard, 2013), subjects perform at close to Bayesian optimality.
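The benchmark in many of these studies is the standard minimum-variance fusion of independent Gaussian cues, in which each cue is weighted by its reliability (inverse variance), as in Ernst and Banks (2002). A minimal sketch with made-up numbers:

```python
import numpy as np

def combine(estimates, variances):
    """Reliability-weighted fusion of independent Gaussian cues:
    weights proportional to inverse variance."""
    w = 1.0 / np.asarray(variances, dtype=float)
    w /= w.sum()
    fused = float(np.dot(w, estimates))
    fused_var = 1.0 / np.sum(1.0 / np.asarray(variances, dtype=float))
    return fused, fused_var

# Hypothetical visual and haptic size estimates (mm) and noise variances.
size, var = combine(estimates=[55.0, 60.0], variances=[4.0, 16.0])
print(f"combined estimate: {size:.1f} mm (variance {var:.1f})")
# -> 56.0 mm: the fused estimate sits nearer the more reliable cue, and its
#    variance (3.2) is lower than that of either cue alone.
```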
The compelling logic of Bayesian decision theory and its useful formalization of Helmholtz's concept of empirical inference notwithstanding, Bayesian approaches that rely on estimating properties of the world are at a loss when seeking to understand visual neurobiology and/or the neural mechanisms underlying psychophysical functions. The reason is simply that biological visual systems cannot acquire the information that Bayesian decision theory demands: when a Bayesian ideal observer predicts perception, it is because the perceived quality is assumed to estimate the actual properties and conditions in the world. Given the inherent ambiguity of retinal images (see Figure 1), however, Bayesian priors and likelihoods of reflectance, illumination or other physical variables are not available to biological visual systems.

Although it is possible to model how neural activity in different sensory systems could be combined using Bayesian decision theory (Fetsch et al., 2013), such models cannot indicate how information about the physical world could be obtained in a way that avoids the quandary illustrated in Figure 1. Indeed, any model based on recovering or estimating real-world parameters, statistically or otherwise, will fail as a canonical explanation of visual perception (see also Jones and Love, 2011; Bowers and Davis, 2012). Biological vision must therefore depend on some other strategy that does not require accessing the real-world parameters of image sources.

Information Theoretic Approaches

A different empirical approach to vision is based on information theory. Within a few years of Claude Shannon's formulation of information theory, which showed how messages could be transmitted efficiently and reliably over noisy communication channels (Shannon, 1948; Shannon and Weaver, 1949), this framework was applied to vision (Attneave, 1954; Barlow, 1961). The premise of these studies was that the properties of visual and other sensory systems would encode, transmit, and decode the empirical characteristics of naturally occurring stimuli with maximum efficiency. Subsequent approaches in these terms have variously interpreted vision to operate on the basis of predictive coding (Srinivasan et al., 1982; Rao and Ballard, 1999; Hosoya et al., 2005); coding that de-correlates the information of noisy inputs (Barlow, 1961; Laughlin, 1981); a filtering scheme for ensuring sparse coding (Olshausen and Field, 1996); and/or greater efficiency achieved by divisive normalization (Schwartz and Simoncelli, 2001; Carandini and Heeger, 2012).
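The common thread can be conveyed by a toy version of predictive coding in the spirit of Srinivasan et al. (1982): each signal value is predicted from its neighbors and only the residual is encoded, so the variance that must be transmitted shrinks. The 1-D correlated signal and two-neighbor predictor below are assumptions made purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# A spatially correlated 1-D "luminance" signal (toy stand-in for an image row).
raw = np.convolve(rng.random(10_000), np.ones(8) / 8, mode="valid")

# Predict each sample from its two neighbors and encode only the residual.
prediction = (raw[:-2] + raw[2:]) / 2
residual = raw[1:-1] - prediction

print(f"raw variance:      {raw.var():.6f}")
print(f"residual variance: {residual.var():.6f}")  # much smaller: cheaper to transmit
```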
The overarching theme of this approach is that optimizing information transfer by minimizing the metabolic and other costs of wiring, action potential generation and synaptic transfer—while at the same time maximizing the entropy of neural communication—could rationalize the characteristics of receptive fields in visual animals (Graham and Field, 2007). As it has turned out, the idea that some features of visual systems arise from efficiently encoding the statistical structure of natural environments is consistent with a number of computational (Srinivasan et al., 1982; Atick and Redlich, 1993; Olshausen and Field, 1996; Bell and Sejnowski, 1997; van Hateren and van der Schaaf, 1998; Brady and Field, 2000; Schwartz and Simoncelli, 2001; Simoncelli and Olshausen, 2001) and physiological studies (Dan et al., 1996; Baddeley et al., 1997; Vinje and Gallant, 2000).

Although the success of models based on information theory leaves no doubt about the advantages of efficient visual processing, the models do not explain how the inevitable conflation of information in images is dealt with by the visual system (see earlier and Figure 1), or why perceived visual qualities do not correspond with measured physical parameters in the visual environment (see Figure 2). Nor do they indicate how biological visual systems successfully guide behavior.

While these deficiencies do not diminish the importance of efficient neural processing conceived in terms of Shannon entropy, efficiency is not directly germane to perception and behavior, just as efficiency in telecommunication is not germane to the content of the messages that are transmitted. Generating perceptions that succeed in a world whose physical parameters cannot be recovered is a different goal, in much the same way that the functional aim of any organ system differs from the concurrent need to achieve its purposes as efficiently as possible.

A Wholly Empirical Approach

The aim of the visual system in the approaches discussed so far is assumed to be the recovery of real-world properties, however imperfectly, from information in retinal stimuli. A different supposition is that since retinal images cannot specify the measurable properties of objects (see Figure 1), achieving this goal is impossible. It follows that visual perceptions must arise from a strategy that does not rely on real-world properties as such. In a wholly empirical conception of vision, the perceptual values we experience are determined by ordering visual qualities according to the frequency of occurrence of image patterns and how this impacts survival (Purves and Lotto, 2003, 2011; Purves et al., 2011, 2014).
In general terms, understanding this strategy is straightforward. Imagine a population of primitive organisms whose behavior is dictated by rudimentary collections of photoreceptors and associated neural connections. As stipulated by neo-Darwinian theory, the organization of both the receptors and their connections in the population is subject to small random variations in structure and function that are acted on by natural selection. Based on interactions with the environment, variations of pre-neural and neural configurations that promote survival tend to be passed down to future generations. As a result, the ranks of visual qualities an agent perceives over some evolved range (darkest-lightest, largest-smallest, fastest-slowest, etc.) reflect biological utility rather than the physically measurable properties of objects and conditions in the world. In short, the role of perceptual states is not to reveal the physical world, but to promote useful behaviors. In this scheme, the world is simply the arena in which the utility of perceptions and other behavioral responses pertinent to survival and reproduction is tested, with feedback from the environment acting as the driving force that gradually instantiates the needed circuitry (Figure 3).
Figure 3. Visual perception based on the frequency of occurrence of patterns and subsequent behavior. By depending on the frequency of scale-invariant patterns in images, useful perceptions can arise without information about physically measurable properties of the world. The driving force in this understanding of vision is a biological feedback loop that, over time, orders the basic visual qualities we perceive by associating the frequency of recurring image patterns with perceptual qualities according to survival and reproductive success.

In implementing this strategy, however, vision cannot rely on entire images, as efficient coding theory has long recognized (see Information Theoretic Approaches). The reason is that the extraordinary detail in most retinal images will rarely, if ever, activate the full array of photoreceptors in exactly the same way again. Processes like evolution and lifetime learning, however, depend on repeated trial and error. Thus rather than relying on images per se, biological vision is better served by relying on the recurring scale-invariant patterns within images to rank perceptual qualities (scale invariance refers to a relationship that does not change when variables such as length and width are multiplied by a common factor). In this way the biological feedback loop diagrammed in Figure 3 can progressively organize both ordinal (e.g., lighter-darker, larger-smaller) and non-ordinal (e.g., color, direction) visual qualities over useful ranges according to the relative frequency of pattern occurrences and feedback from behavior. This concept is consistent with classical physiological studies demonstrating the transformation of images by the evolved receptive fields of early-level visual neurons (Hubel, 1988; Hubel and Wiesel, 2005), with the goal of reducing the redundancy of image information by efficient coding (Graham and Field, 2007), and with psychophysical studies showing that the frequency of occurrence of image patterns extracted from natural scenes predicts human visual perceptions (Yang and Purves, 2004).

An Example

To appreciate how vision can operate in this way, consider the perceptions of lightness-darkness elicited by natural luminance patterns. Figure 4 shows two simple patterns in which the luminance of the central squares is the same, but the luminance of the surrounding areas differs.
As has been noted since Michel Chevreul's studies in the 19th century, the central squares appear differently light, thus failing to agree with physical measurements.

Figure 4. Lightness percepts elicited by luminance patterns. The two patterns comprise central squares with identical luminance values surrounded by regions that have a lower (left panel) or higher (right panel) luminance. The central squares appear differently light in these contexts, despite the fact that they are physically the same. The inset shows that when placed on the same background the central squares elicit the same lightness, although this percept differs from the lightness of the squares in either of the two patterns above.

In wholly empirical terms, the reason for this effect is outlined in Figure 5.
In the course of maximizing survival and reproductive success in response to scale-invariant patterns of luminance, evolution and lifetime learning will have ranked perceptions of relative lightness-darkness according to the frequency of occurrence of the luminance of any element in a pattern, given the luminance values of the rest of the elements. Absent this ordering according to the frequency of recurring image patterns, the generation of useful perceptions and behaviors would be stymied by the fact that these or any other patterns cannot specify the measured properties of the objects and conditions that gave rise to them (see Figure 1).

Figure 5. Lightness predicted by the frequency of recurrent luminance patterns. The contexts of luminance patterns in column 1 are the same as in Figure 4, with an unspecified central value indicated by the question marks. The frequency of occurrence of central luminance values in these patterns can be determined by repeatedly sampling natural images using the patterns as templates (see column 2).
To maximize behavioral success, the lightness elicited by the central luminance value in Figure 4 (indicated by the red 'Ts' in column 2) should evolve to accord with its accumulated frequency of occurrence in the two patterns (dashed red lines in the graphs in column 3) rather than with its actual luminance, thus explaining why the same central luminance in Figure 4 is perceived differently. Organisms therefore evolve to match their perceptions to the accumulated frequencies of occurrence of targets given a context through their enhanced survival over evolutionary time (as shown in Figure 3). (Note that using templates to determine the frequency of occurrence of patterns is simply a convenient way of collecting the pertinent data, and does not imply that the visual system uses templates to sample retinal images.) (Original data are in Yang and Purves, 2004.)

As shown in Figure 5, the empirical incidence of the two patterns arising in retinal images generated by a database of natural images shows that the same central luminance value occurs less often in the context of a lower-luminance surround than in the context of a higher-luminance surround (column 2; Yang and Purves, 2004). The reason is that in any non-random pattern, nearby points will tend to have similar luminance values (see Figure 4; Olshausen and Field, 1996, 2000). Consequently, if the lightness-darkness values of the central squares are ordered according to their relative frequency of occurrence in these patterns (column 3), the same luminance value should elicit a lighter appearance in the context of a less luminant surround when compared to a more luminant surround, as it does (see Figure 4).
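A minimal sketch of this template-sampling analysis, with smoothed random arrays standing in for the calibrated natural-image database of Yang and Purves (2004) (an assumption made purely so the example is self-contained):

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_natural_image(size=128, blur=4):
    """Smoothed random array: a stand-in for a natural image in which,
    as in real scenes, nearby points have similar luminance values."""
    img = rng.random((size, size))
    kernel = np.ones(blur) / blur
    for axis in (0, 1):
        img = np.apply_along_axis(np.convolve, axis, img, kernel, mode="same")
    return img

def center_values_given_surround(images, surround_lum, tol=0.05, tries=2000):
    """Template sampling: collect central luminances of 3x3 patches whose
    mean surround luminance matches the template's surround value."""
    centers = []
    for img in images:
        for _ in range(tries):
            y, x = rng.integers(1, img.shape[0] - 1, size=2)
            patch = img[y - 1:y + 2, x - 1:x + 2]
            surround = (patch.sum() - patch[1, 1]) / 8
            if abs(surround - surround_lum) < tol:
                centers.append(patch[1, 1])
    return np.array(centers)

images = [toy_natural_image() for _ in range(20)]
target = 0.5                    # the identical central luminance in Figure 4
for surround in (0.3, 0.7):     # darker vs. lighter surround context
    centers = center_values_given_surround(images, surround)
    # Empirical rank (accumulated frequency of occurrence): on a wholly
    # empirical account this percentile, not the luminance itself,
    # predicts the lightness seen.
    rank = (centers < target).mean()
    print(f"surround {surround:.1f}: target at the {100 * rank:.0f}th percentile")
```

The same ranking logic, applied to a large natural-image database rather than these toy arrays, is what yields the predictions in column 3 of Figure 5.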
In summary, the frequencies of occurrence of luminance values in image patterns responded to over time predict the qualities we see in this example because the range of this basic perceptual quality (lightness-darkness) has been ordered over a useful range (lightest to darkest) according to the relative success of stimulus-response associations. Similar ordering of data arising from the frequency of pattern occurrence in both natural and simulated environments has been used to rationalize more complex stimuli that elicit perceptions of lightness (Yang and Purves, 2004), color (Long et al., 2006), interval and angle magnitude (Howe and Purves, 2005), the speed of motion (Wojtach et al., 2008, 2009), and the direction of motion (Sung et al., 2009).

Consequences of Input-Output Associations on a Wholly Empirical Basis

Behaviorally successful associations generated in this way automatically tie the frequency of occurrence of stimulus patterns to the frequency of occurrence of responses, explaining why relying on the frequency of occurrence of stimulus patterns predicts perception: every time a given image pattern occurs as input, the associated output arises from trial and error feedback, which in biology tracks reproductive success. The result is perceptions that become more and more useful over time. Although in any trial and error process input-output equivalence is never reached, after sufficient evolution the cumulative distribution function of the stimulus input will come to align with the cumulative distribution function of the perceptual output closely enough to predict many of the results of human psychophysics (Purves et al., 2014).
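A toy illustration of this alignment (the exponential stimulus distribution is an arbitrary stand-in for natural stimulus statistics):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stimulus inputs with a skewed frequency of occurrence.
history = np.sort(rng.exponential(scale=1.0, size=100_000))

def perceived_rank(x):
    """Percept as accumulated frequency of occurrence (the empirical CDF)."""
    return np.searchsorted(history, x) / len(history)

for x in (0.5, 1.0, 2.0, 4.0):
    print(f"stimulus {x:>3}: perceived rank {perceived_rank(x):.2f}")
# Doubling the stimulus does not double the percept: because large values
# occur rarely, the mapping is compressive, and by construction the
# cumulative distributions of input and output are aligned.
```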
When conceived in this way it makes sense that visual perceptions are not correlated with light intensity or any other physical property, as psychophysics amply demonstrates. Although relying on the frequency of occurrence of patterns uncouples perceived values from their measured physical parameters (e.g., surface reflectance), it endows visual agents with the ability to perceive and act in their environments in ways that led to biological success in the past, and are therefore likely to succeed in the present. While this strategy makes it seem that we see the world as it really is, vision on a wholly empirical basis is not veridical and has a different goal: to generate useful perceptions without measuring or recovering real-world properties.

Exploring Neuronal Connectivity in Wholly Empirical Terms

Bayesian approaches to perception use inferences about real-world properties as a tool for understanding whatever processing is accomplished by the visual brain. But as has already been emphasized, biological sensing systems cannot recover these properties.

The wholly empirical alternative we describe is generally consistent with other studies that do not assume the recovery of real-world properties (e.g., Janke et al., 1999; Onat et al., 2011). Ultimately, any approach to vision based on empirically successful input-output associations must explain how this strategy is related to the documented physiology and anatomy of the primate and other visual systems. In principle, the most direct way to unravel the circuit mechanics underlying a wholly empirical (or any other) strategy would be to mimic the trial and error process of association on which evolution relies. Until relatively recently, this approach would have been fanciful. But the advent of genetic and other computer algorithms has made simulating the evolution of artificial neural networks in model environments relatively easy. This technology offers a way of linking any empirical understanding of vision to the wealth of information already in hand from physiological and anatomical studies.

A number of studies have shown the feasibility of evolving neural networks on the basis of experience (Geisler and Diehl, 2002; Boots et al., 2007; Corney and Lotto, 2007; Geisler et al., 2009; Burge and Geisler, 2011). More recent work has asked whether the connectivity and operating principles of networks evolved on a wholly empirical basis are similar to those found in biological circuitry. For example, simple networks have been evolved to rank responses according to the frequency of occurrence of patterns extracted from natural and simulated images (Ng et al., 2013; Morgenstern et al., 2014). The most obvious feature that emerges is the center-surround receptive field. In addition to efficiency, this organization enables the interaction of targets and contexts, heightens sensitivity to frequently occurring stimuli, and automatically adapts to overall luminance and local contrast. These features are all characteristic of neurons in the early stages of visual systems like ours (Sakmann and Creutzfeldt, 1969; Geisler and Albrecht, 1992; Bonin et al., 2005; Hubel and Wiesel, 2005).
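A minimal sketch of this kind of simulated evolution, using a toy mutation-and-selection loop rather than the actual genetic algorithms of Ng et al. (2013) or Morgenstern et al. (2014): a population of 3×3 linear "networks" is selected to rank patches by the empirical frequency ordering of their central luminance given the surround, and the fittest weights tend toward a center-surround layout:

```python
import numpy as np

rng = np.random.default_rng(2)

# 3x3 patches from a smoothed random "scene" (toy natural-image statistics).
img = rng.random((64, 64))
k = np.ones(5) / 5
for ax in (0, 1):
    img = np.apply_along_axis(np.convolve, ax, img, k, mode="same")
ys, xs = rng.integers(1, 63, 500), rng.integers(1, 63, 500)
patches = np.stack([img[y-1:y+2, x-1:x+2].ravel() for y, x in zip(ys, xs)])

# Target: rank of each patch's center among patches with similar surrounds,
# i.e., the frequency-of-occurrence ordering the theory says perception tracks.
surround = (patches.sum(1) - patches[:, 4]) / 8
edges = np.quantile(surround, np.linspace(0, 1, 11)[1:-1])
bins = np.digitize(surround, edges)
target = np.empty(len(patches))
for b in np.unique(bins):
    idx = np.where(bins == b)[0]
    target[idx] = patches[idx, 4].argsort().argsort() / max(len(idx) - 1, 1)

def fitness(w):
    # How well the network's response orders patches like the target ranks.
    return np.corrcoef(patches @ w, target)[0, 1]

# Mutation plus truncation selection over a population of weight vectors.
pop = rng.normal(0, 0.1, (40, 9))
for _ in range(300):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.05, (30, 9))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(w) for w in pop])]
print(best.reshape(3, 3).round(2))  # typically: positive center, negative surround
```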
\u003Ca href=\"#B66\"\u003ESherrington (1947)\u003C\u002Fa\u003E, who pioneered the study of reflex circuits, was well aware that the concept of a “simple” reflex is, in his words, a “convenient…fiction”, since “all parts of the nervous system are connected together and no part of it is ever capable of reaction without affecting and being affected by other parts …”. There is no evidence that any response to sensory input differs from a spinal reflex, other than by the number of synaptic connections in the input-output circuitry. Understanding vision as reflexive (i.e., hard-wired at any given moment but subject to modification by subsequent experience) also affords the ability to account for visual perceptions generated within a few tens of milliseconds in response to complex stimuli such as wind-blown leaves, running water, animal movements and numerous other circumstances. Computer vision models that depend on reverse-engineering scenes from images by inferring the large number of real world sources that could have generated these complex image streams would likely require more computational power than is necessary for the tasks that visual and other biological sensing systems routinely carry out. Although it is difficult to imagine how visual systems could generate perceptions of complex scenes almost immediately by a series of hierarchical computations, this problem is resolved if visual “processing” is re-imagined as the result of “computations” that have, in effect, already been accomplished by laying down connectivity instantiated by feedback from empirical success over evolutionary and individual time (see Figure \u003Ca href=\"#F3\"\u003E3\u003C\u002Fa\u003E). This strategic difference is presumably the main reason why machine vision based on logical algorithms cannot match the performance of biological vision on many tasks.\u003C\u002Fp\u003E\r\n\u003Ca id=\"h11\" name=\"h11\"\u003E\u003C\u002Fa\u003E\u003Ch2\u003ELimitations of a Wholly Empirical Approach\u003C\u002Fh2\u003E\r\n\u003Cp class=\"mb0\"\u003EAs with any theory, there are limitations to the strategy of visual perception advocated here, both methodological and conceptual. With respect to methodology, when investigating the perception of lightness (see Figures \u003Ca href=\"#F4\"\u003E4\u003C\u002Fa\u003E, \u003Ca href=\"#F5\"\u003E5\u003C\u002Fa\u003E), the luminance values comprising the database were collected from a limited range of environments assumed to be representative of the types of scenes where the human visual system evolved. In addition, the fact that humans and other animals attend to specific aspects of the environment, thus biasing the frequency distribution of sensory input, was not taken into account. While these and other deficiencies are important, given that this strategy successfully predicts the standard simultaneous lightness contrast effect shown in Figure \u003Ca href=\"#F4\"\u003E4\u003C\u002Fa\u003E and a variety of more complex lightness effects (\u003Ca href=\"#B79\"\u003EYang and Purves, 2004\u003C\u002Fa\u003E)—in addition to other puzzling perceptions of color, form and motion (see above)—the empirical framework seems well supported by evidence that has not been supplied by other approaches. 
This last point stands as a challenge to any theory of perception, including broader unifying concepts such as the idea that the common goal of brain function is to satisfy a "free-energy principle" (Friston, 2010).

The Wholly Empirical Theory and Cognition

It is worth noting that higher order phenomena such as visual attention and visual memory could also arise by associating the relative frequency of recurring scale-invariant image patterns with useful responses. As in the case of the basic visual qualities considered here, the relevant circuitry would also be reflexive, without the need to invoke additional "cognitive" mechanisms: every time a given image pattern occurred the response dictated by association would be further enhanced according to its utility. As a result the foci of visual attention and the visual memories elicited would, like perceptions, gradually become more and more useful over time.

Conclusion

The idea that vision operates empirically has taken several forms and enjoyed different degrees of enthusiasm since Helmholtz introduced the concept of unconscious inference in the 19th century. Vision on a wholly empirical basis is now seen by some investigators as the most plausible way to understand how stimuli that cannot specify their physical sources can nonetheless give rise to useful perceptions and routinely successful visually guided behaviors. Understanding perception in these terms implies a strategy of nervous system operation that differs fundamentally from the concept of detecting stimulus features and recovering real-world properties by algorithmic computations that in one way or another depend on accessing physical parameters to guide actions. By relying on evolved reflex associations that have ordered visual qualities according to the impact of the relative frequency of occurrence of stimulus patterns on reproductive success, vision can circumvent the inherent uncertainty of retinal images, and explain the qualities we actually see.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

Allred, S. R., and Brainard, D. H. (2013). A Bayesian model of lightness perception that incorporates spatial variation in the illumination. J. Vis. 13:18. doi: 10.1167/13.7.18
Atick, J., and Redlich, A. (1993). Convergent algorithm for sensory receptive field development. Neural Comput. 5, 45–60. doi: 10.1162/neco.1993.5.1.45

Attneave, F. (1954). Informational aspects of visual perception. Psychol. Rev. 61, 183–193. doi: 10.1037/h0054663

Baddeley, R., Abbott, L. F., Booth, M. C., Sengpiel, F., Freeman, T., Wakeman, E. A., et al. (1997). Responses of neurons in primary and inferior temporal visual cortices to natural scenes. Proc. Biol. Sci. 264, 1775–1783. doi: 10.1098/rspb.1997.0246
Barlow, H. B. (1961). "Possible principles underlying the transformation of sensory messages," in Sensory Communication, ed. W. A. Rosenblith (Cambridge, MA: MIT Press), 217–236.

Bayes, T. R. (1763). An essay towards solving a problem in the doctrine of chances. Phil. Trans. R. Soc. London 53, 370–418. doi: 10.1098/rstl.1763.0053

Bell, A. J., and Sejnowski, T. J. (1997). The "independent components" of natural scenes are edge filters. Vision Res. 37, 3327–3338. doi: 10.1016/s0042-6989(97)00121-1

Boots, B., Nundy, S., and Purves, D. (2007). Evolution of visually guided behavior in artificial agents. Network 18, 11–34. doi: 10.1080/09548980601113254
Bonin, V., Mante, V., and Carandini, M. (2005). The suppressive field of neurons in lateral geniculate nucleus. J. Neurosci. 25, 10844–10856. doi: 10.1523/jneurosci.3562-05.2005

Bowers, J. S., and Davis, C. J. (2012). Bayesian just-so stories in psychology and neuroscience. Psychol. Bull. 138, 389–414. doi: 10.1037/a0026450

Brady, N., and Field, D. J. (2000). Local contrast in natural images: normalization and coding efficiency. Perception 29, 1041–1056. doi: 10.1068/p2996
Brainard, D. H. (2009). "Bayesian approaches to color vision," in The Cognitive Neurosciences, 4th Edn, ed. M. S. Gazzaniga (Cambridge, MA: MIT Press), 395–408.

Brünswik, E. (1956/1997). Perception and the Psychological Design of Representative Experiments. 2nd Edn. Berkeley: University of California Press.

Burge, J., and Geisler, W. S. (2011). Optimal defocus estimation in individual natural images. Proc. Natl. Acad. Sci. U S A 108, 16849–16854. doi: 10.1073/pnas.1108491108

Carandini, M., and Heeger, D. J. (2012). Normalization as a canonical neural computation. Nat. Rev. Neurosci. 13, 51–62. doi: 10.1038/nrn3136
Corney, D., and Lotto, R. B. (2007). What are lightness illusions and why do we see them? PLoS Comput. Biol. 3:e180. doi: 10.1371/journal.pcbi.0030180

Dan, Y., Atick, J. J., and Reid, R. C. (1996). Efficient coding of natural scenes in the lateral geniculate nucleus: experimental test of a computational theory. J. Neurosci. 16, 3351–3362.

Ernst, M. O., and Banks, M. S. (2002). Humans integrate visual and haptic information in a statistically optimal fashion. Nature 415, 429–433. doi: 10.1038/415429a
doi: 10.1038\u002F415429a\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=11807554\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1038\u002F415429a\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Humans+integrate+visual+and+haptic+information+in+a+statistically+optimal+fashion&author=Ernst+M.+O.&author=Banks+M.+S.&publication_year=2002&journal=Nature&volume=415&pages=429-433\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B19\" id=\"B19\"\u003E\u003C\u002Fa\u003EFetsch, C. R., DeAngelis, G. C., and Angelaki, D. E. (2013). Bridging the gap between theories of sensory cue integration and the physiology of multisensory neurons. \u003Ci\u003ENat. Rev. Neurosci.\u003C\u002Fi\u003E 14, 429–442. doi: 10.1038\u002Fnrn3503\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=23686172\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1038\u002Fnrn3503\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Bridging+the+gap+between+theories+of+sensory+cue+integration+and+the+physiology+of+multisensory+neurons&author=Fetsch+C.+R.&author=DeAngelis+G.+C.&author=Angelaki+D.+E.&publication_year=2013&journal=Nat.+Rev.+Neurosci.&volume=14&pages=429-442\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B20\" id=\"B20\"\u003E\u003C\u002Fa\u003EFriston, K. (2010). The free-energy principle: a unified brain theory? \u003Ci\u003ENat. Rev. Neurosci.\u003C\u002Fi\u003E 11, 127–138. doi: 10.1038\u002Fnrn2787\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=20068583\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1038\u002Fnrn2787\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=The+free-energy+principle%3A+a+unified+brain+theory%3F&author=Friston+K.&publication_year=2010&journal=Nat.+Rev.+Neurosci.&volume=11&pages=127-138\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B21\" id=\"B21\"\u003E\u003C\u002Fa\u003EGeisler, W. S. (2011). Contributions of ideal observer theory to vision research. \u003Ci\u003EVision Res.\u003C\u002Fi\u003E 51, 771–781. 
doi: 10.1016\u002Fj.visres.2010.09.027\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=20920517\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.visres.2010.09.027\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Contributions+of+ideal+observer+theory+to+vision+research&author=Geisler+W.+S.&publication_year=2011&journal=Vision+Res.&volume=51&pages=771-781\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B22\" id=\"B22\"\u003E\u003C\u002Fa\u003EGeisler, W. S., and Albrecht, D. G. (1992). Cortical neurons: isolation of contrast gain control. \u003Ci\u003EVision Res.\u003C\u002Fi\u003E 32, 1409–1410. doi: 10.1016\u002F0042-6989(92)90196-p\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=1455713\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1016\u002F0042-6989(92)90196-p\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Cortical+neurons%3A+isolation+of+contrast+gain+control&author=Geisler+W.+S.&author=Albrecht+D.+G.&publication_year=1992&journal=Vision+Res.&volume=32&pages=1409-1410\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B23\" id=\"B23\"\u003E\u003C\u002Fa\u003EGeisler, W. S., and Diehl, R. L. (2002). Bayesian natural selection and the evolution of perceptual systems. \u003Ci\u003EPhilos. Trans. R. Soc. Lond. B Biol. Sci.\u003C\u002Fi\u003E 357, 419–448. doi: 10.1098\u002Frstb.2001.1055\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=12028784\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1098\u002Frstb.2001.1055\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Bayesian+natural+selection+and+the+evolution+of+perceptual+systems&author=Geisler+W.+S.&author=Diehl+R.+L.&publication_year=2002&journal=Philos.+Trans.+R.+Soc.+Lond.+B+Biol.+Sci.&volume=357&pages=419-448\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B24\" id=\"B24\"\u003E\u003C\u002Fa\u003EGeisler, W. S., Najemnik, J., and Ing, A. D. (2009). Optimal stimulus encoders for natural tasks. \u003Ci\u003EJ. Vis.\u003C\u002Fi\u003E 17, 1–16. 
doi: 10.1167\u002F9.13.17\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=20055550\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1167\u002F9.13.17\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Optimal+stimulus+encoders+for+natural+tasks&author=Geisler+W.+S.&author=Najemnik+J.&author=Ing+A.+D.&publication_year=2009&journal=J.+Vis.&volume=17&pages=1-16\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B25\" id=\"B25\"\u003E\u003C\u002Fa\u003EGelb, A. (1929). “Die farbenkonstanz der sehdinge,” in \u003Ci\u003EHandbuch Normalen und Pathologischen Psychologie\u003C\u002Fi\u003E, ed. W. A. von Bethe (Berlin: Springer-Verlag), 594–678.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Die+farbenkonstanz+der+sehdinge&author=Gelb+A.&publication_year=1929&pages=594-678\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B26\" id=\"B26\"\u003E\u003C\u002Fa\u003EGibson, J. J. (1966). \u003Ci\u003EThe Senses Considered as Perceptual Systems.\u003C\u002Fi\u003E Boston: Houghton Mifflin.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=The+Senses+Considered+as+Perceptual+Systems&author=Gibson+J.+J.&publication_year=1966\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B27\" id=\"B27\"\u003E\u003C\u002Fa\u003EGibson, J. J. (1979). \u003Ci\u003EThe Ecological Approach to Visual Perception.\u003C\u002Fi\u003E Hillsdale, NJ: Lawrence Erlbaum.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=The+Ecological+Approach+to+Visual+Perception&author=Gibson+J.+J.&publication_year=1979\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B28\" id=\"B28\"\u003E\u003C\u002Fa\u003EGraham, D. J., and Field, D. J. (2007). “Efficient coding of natural images,” in \u003Ci\u003ENew Encyclopedia of Neurosciences\u003C\u002Fi\u003E, L. R. Squire (New York: Elsevier), 19–27.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Efficient+coding+of+natural+images&author=Graham+D.+J.&author=Field+D.+J.&publication_year=2007&pages=19-27\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\" style=\"margin-bottom:0.5em;\"\u003E \r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B29\" id=\"B29\"\u003E\u003C\u002Fa\u003EHelmholtz, H. (1866\u002F1924). 
\u003Ci\u003EHelmholtz’s Treatise on Physiological Optics, Third German Edition, Vols. I-III, 1909\u003C\u002Fi\u003E, (J. P. C. Southall translation) (New York: The Optical Society of America).\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B30\" id=\"B30\"\u003E\u003C\u002Fa\u003EHosoya, T., Baccus, S. A., and Meister, M. (2005). Dynamic predictive coding by the retina. \u003Ci\u003ENature\u003C\u002Fi\u003E 436, 71–77. doi: 10.1038\u002Fnature03689\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=16001064\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1038\u002Fnature03689\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Dynamic+predictive+coding+by+the+retina&author=Hosoya+T.&author=Baccus+S.+A.&author=Meister+M.&publication_year=2005&journal=Nature&volume=436&pages=71-77\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B31\" id=\"B31\"\u003E\u003C\u002Fa\u003EHowe, C. Q., and Purves, D. (2005). \u003Ci\u003EPerceiving Geometry: Geometrical Illusions Explained by Natural Scene Statistics.\u003C\u002Fi\u003E New York: Springer Press.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Perceiving+Geometry%3A+Geometrical+Illusions+Explained+by+Natural+Scene+Statistics&author=Howe+C.+Q.&author=Purves+D.&publication_year=2005\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B32\" id=\"B32\"\u003E\u003C\u002Fa\u003EHubel, D. H. (1988). \u003Ci\u003EEye Brain and Vision\u003C\u002Fi\u003E. New York: Scientific American Press.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Eye+Brain+and+Vision&author=Hubel+D.+H.&publication_year=1988\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B33\" id=\"B33\"\u003E\u003C\u002Fa\u003EHubel, D. H., and Wiesel, T. (2005). \u003Ci\u003EBrain and Visual Perception. A story of a 25-year Collaboration.\u003C\u002Fi\u003E New York: Oxford University Press.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Brain+and+Visual+Perception.+A+story+of+a+25-year+Collaboration&author=Hubel+D.+H.&author=Wiesel+T.&publication_year=2005\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B34\" id=\"B34\"\u003E\u003C\u002Fa\u003EJanke, D., Erlhagen, W., Dinse, H. R., Akhavan, A. C., Giese, M., Steinhage, A., et al. (1999). Parametric population representation of retinal location: neuronal interaction dynamics in cat primary visual cortex. 
\u003Ci\u003EJ. Neurosci.\u003C\u002Fi\u003E 19, 9016–9028. \u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=10516319\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Parametric+population+representation+of+retinal+location%3A+neuronal+interaction+dynamics+in+cat+primary+visual+cortex&author=Janke+D.&author=Erlhagen+W.&author=Dinse+H.+R.&author=Akhavan+A.+C.&author=Giese+M.&author=Steinhage+A.&+&publication_year=1999&journal=J.+Neurosci.&volume=19&pages=9016-9028\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B35\" id=\"B35\"\u003E\u003C\u002Fa\u003EJones, M., and Love, B. C. (2011). Bayesian fundamentalism or enlightenment? on the explanatory status and theoretical contributions of bayesian models of cognition. \u003Ci\u003EBehav. Brain Sci.\u003C\u002Fi\u003E 34, 169–188, disuccsion 188–231. doi: 10.1017\u002Fs0140525x10003134\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=21864419\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1017\u002Fs0140525x10003134\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Bayesian+fundamentalism+or+enlightenment%3F+on+the+explanatory+status+and+theoretical+contributions+of+bayesian+models+of+cognition&author=Jones+M.&author=Love+B.+C.&publication_year=2011&journal=Behav.+Brain+Sci.&volume=34&pages=169-188\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B36\" id=\"B36\"\u003E\u003C\u002Fa\u003EKersten, D., Mamassian, P., and Yuille, A. (2004). Object perception as bayesian inference. \u003Ci\u003EAnnu. Rev. Psychol.\u003C\u002Fi\u003E 55, 271–304. doi: 10.1146\u002Fannurev.psych.55.090902.142005\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=14744217\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1146\u002Fannurev.psych.55.090902.142005\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Object+perception+as+bayesian+inference&author=Kersten+D.&author=Mamassian+P.&author=Yuille+A.&publication_year=2004&journal=Annu.+Rev.+Psychol.&volume=55&pages=271-304\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B37\" id=\"B37\"\u003E\u003C\u002Fa\u003EKersten, D., and Yuille, A. (2003). Bayesian models of object perception. \u003Ci\u003ECurr. Opin. Neurobiol.\u003C\u002Fi\u003E 13, 1–9. 
doi: 10.1016\u002Fs0959-4388(03)00042-4\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Bayesian+models+of+object+perception&author=Kersten+D.&author=Yuille+A.&publication_year=2003&journal=Curr.+Opin.+Neurobiol.&volume=13&pages=1-9\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B38\" id=\"B38\"\u003E\u003C\u002Fa\u003EKnill, D. C., and Pouget, A. (2004). The bayesian brain: the role of uncertainty in neural coding and computation. \u003Ci\u003ETrends Neurosci.\u003C\u002Fi\u003E 27, 712–719. doi: 10.1016\u002Fj.tins.2004.10.007\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=15541511\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.tins.2004.10.007\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=The+bayesian+brain%3A+the+role+of+uncertainty+in+neural+coding+and+computation&author=Knill+D.+C.&author=Pouget+A.&publication_year=2004&journal=Trends+Neurosci.&volume=27&pages=712-719\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B39\" id=\"B39\"\u003E\u003C\u002Fa\u003EKnill, D. C., and Richards, W. (1996). \u003Ci\u003EPerception as Bayesian Inference.\u003C\u002Fi\u003E Cambridge: Cambridge University Press.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Perception+as+Bayesian+Inference&author=Knill+D.+C.&author=Richards+W.&publication_year=1996\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B40\" id=\"B40\"\u003E\u003C\u002Fa\u003EKnill, D. C., and Saunders, J. A. (2003). Do humans optimally integrate stereo and texture information for judgments of surface slant? \u003Ci\u003EVision Res.\u003C\u002Fi\u003E 43, 2539–2558. doi: 10.1016\u002Fs0042-6989(03)00458-9\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=13129541\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fs0042-6989(03)00458-9\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Do+humans+optimally+integrate+stereo+and+texture+information+for+judgments+of+surface+slant%3F&author=Knill+D.+C.&author=Saunders+J.+A.&publication_year=2003&journal=Vision+Res.&volume=43&pages=2539-2558\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B41\" id=\"B41\"\u003E\u003C\u002Fa\u003EKoffka, K. (1935). 
\u003Ci\u003EPrincipals of Gestalt Psychology.\u003C\u002Fi\u003E New York: Harcourt Brace.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Principals+of+Gestalt+Psychology&author=Koffka+K.&publication_year=1935\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\" style=\"margin-bottom:0.5em;\"\u003E \r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B42\" id=\"B42\"\u003E\u003C\u002Fa\u003EKöhler, W. (1947). \u003Ci\u003EGestalt Psychology: An Introduction to New Concepts in Modern Psychology.\u003C\u002Fi\u003E New York: Liveright.\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B43\" id=\"B43\"\u003E\u003C\u002Fa\u003EKörding, K. P. (2014). Bayesian statistics: relevant for the brain? \u003Ci\u003ECurr. Opin. Neurobiol.\u003C\u002Fi\u003E 25, 130–133. doi: 10.1016\u002Fj.conb.2014.01.003\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=24463330\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.conb.2014.01.003\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Bayesian+statistics%3A+relevant+for+the+brain%3F&author=Körding+K.+P.&publication_year=2014&journal=Curr.+Opin.+Neurobiol.&volume=25&pages=130-133\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B44\" id=\"B44\"\u003E\u003C\u002Fa\u003EKörding, K. P., and Wolpert, D. M. (2004). Bayesian integration in sensorimotor learning. \u003Ci\u003ENature\u003C\u002Fi\u003E 427, 244–247. doi: 10.1038\u002Fnature02169\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=14724638\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1038\u002Fnature02169\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Bayesian+integration+in+sensorimotor+learning&author=Körding+K.+P.&author=Wolpert+D.+M.&publication_year=2004&journal=Nature&volume=427&pages=244-247\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B45\" id=\"B45\"\u003E\u003C\u002Fa\u003EKuffler, S. W. (1953). Discharge patterns and functional organization of mammalian retina. \u003Ci\u003EJ. Neurophysiol.\u003C\u002Fi\u003E 16, 37–68. 
\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=13035466\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Discharge+patterns+and+functional+organization+of+mammalian+retina&author=Kuffler+S.+W.&publication_year=1953&journal=J.+Neurophysiol.&volume=16&pages=37-68\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B46\" id=\"B46\"\u003E\u003C\u002Fa\u003ELaughlin, S. (1981). A simple coding procedure enhances a neuron’s information capacity. \u003Ci\u003EZ. Naturforsch.\u003C\u002Fi\u003E 36, 910–912. \u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=7303823\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=A+simple+coding+procedure+enhances+a+neuron's+information+capacity&author=Laughlin+S.&publication_year=1981&journal=Z.+Naturforsch.&volume=36&pages=910-912\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B47\" id=\"B47\"\u003E\u003C\u002Fa\u003ELee, T. S., and Mumford, D. (2003). Hierarchical bayesian inference in the visual cortex. \u003Ci\u003EJ. Opt. Soc. Am. A\u003C\u002Fi\u003E 20, 1434–1448. doi: 10.1364\u002Fjosaa.20.001434\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=12868647\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1364\u002Fjosaa.20.001434\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Hierarchical+bayesian+inference+in+the+visual+cortex&author=Lee+T.+S.&author=Mumford+D.&publication_year=2003&journal=J.+Opt.+Soc.+Am.+A&volume=20&pages=1434-1448\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B48\" id=\"B48\"\u003E\u003C\u002Fa\u003ELong, F., Yang, Z., and Purves, D. (2006). Spectral statistics in natural scenes predict hue, saturation and brightness. \u003Ci\u003EProc. Natl. Acad. Sci. U S A\u003C\u002Fi\u003E 103, 6013–6018. 
doi: 10.1073\u002Fpnas.0600890103\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=16595630\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1073\u002Fpnas.0600890103\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Spectral+statistics+in+natural+scenes+predict+hue,+saturation+and+brightness&author=Long+F.&author=Yang+Z.&author=Purves+D.&publication_year=2006&journal=Proc.+Natl.+Acad.+Sci.+U+S+A&volume=103&pages=6013-6018\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B49\" id=\"B49\"\u003E\u003C\u002Fa\u003EMamassian, P., Landy, M., and Maloney, L. T. (2002). “Bayesian modelling of visual perception,” in \u003Ci\u003EProbabilistic Models of the Brain: Perception and Neural Function\u003C\u002Fi\u003E, eds R. P. N. Rao, B. A. Olshausen, and M. S. Lewicki (Cambridge, MA: MIT Press), 13–36.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Bayesian+modelling+of+visual+perception&author=Mamassian+P.&author=Landy+M.&author=Maloney+L.+T.&publication_year=2002&pages=13-36\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\" style=\"margin-bottom:0.5em;\"\u003E \r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B50\" id=\"B50\"\u003E\u003C\u002Fa\u003EMarr, D. (1982). \u003Ci\u003EVision: A Computational Investigation into Human Representation and Processing of Visual Information.\u003C\u002Fi\u003E San Francisco: W.H. Freeman.\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B51\" id=\"B51\"\u003E\u003C\u002Fa\u003EMorgenstern, Y., Murray, R. F., and Harris, L. R. (2011). The human visual system’s assumption that light comes from above is weak. \u003Ci\u003EProc. Natl. Acad. Sci. U S A\u003C\u002Fi\u003E 108, 12551–12553. doi: 10.1073\u002Fpnas.1100794108\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=21746935\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1073\u002Fpnas.1100794108\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=The+human+visual+system's+assumption+that+light+comes+from+above+is+weak&author=Morgenstern+Y.&author=Murray+R.+F.&author=Harris+L.+R.&publication_year=2011&journal=Proc.+Natl.+Acad.+Sci.+U+S+A&volume=108&pages=12551-12553\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B52\" id=\"B52\"\u003E\u003C\u002Fa\u003EMorgenstern, Y., Rukmini, D. V., Monson, B. B., and Purves, D. (2014). Properties of artificial neurons that report lightness based on accumulated experience with luminance. \u003Ci\u003EFront. Comput. 
Neurosci.\u003C\u002Fi\u003E 8:134. doi: 10.3389\u002Ffncom.2014.00134\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=25404912\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.3389\u002Ffncom.2014.00134\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Properties+of+artificial+neurons+that+report+lightness+based+on+accumulated+experience+with+luminance&author=Morgenstern+Y.&author=Rukmini+D.+V.&author=Monson+B.+B.&author=Purves+D.&publication_year=2014&journal=Front.+Comput.+Neurosci.&volume=8&pages=134\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B53\" id=\"B53\"\u003E\u003C\u002Fa\u003ENg, C., Sundararajan, J., Hogan, M., and Purves, D. (2013). Network connections that evolve to circumvent the inverse optics problem. \u003Ci\u003EPLoS One\u003C\u002Fi\u003E 8:e60490. doi: 10.1371\u002Fjournal.pone.0060490\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=23555981\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1371\u002Fjournal.pone.0060490\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Network+connections+that+evolve+to+circumvent+the+inverse+optics+problem&author=Ng+C.&author=Sundararajan+J.&author=Hogan+M.&author=Purves+D.&publication_year=2013&journal=PLoS+One&volume=8&pages=e60490\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B54\" id=\"B54\"\u003E\u003C\u002Fa\u003EOlshausen, B. A., and Field, D. J. (1996). Emergence of simple-cell receptive field properties by learning a sparse code for natural images. \u003Ci\u003ENature\u003C\u002Fi\u003E 381, 607–609. doi: 10.1038\u002F381607a0\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=8637596\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1038\u002F381607a0\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Emergence+of+simple-cell+receptive+field+properties+by+learning+a+sparse+code+for+natural+images&author=Olshausen+B.+A.&author=Field+D.+J.&publication_year=1996&journal=Nature&volume=381&pages=607-609\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B540\" id=\"B540\"\u003E\u003C\u002Fa\u003EOlshausen, B. A., and Field, D. J. (2000). Vision and the coding of natural images. \u003Ci\u003EAm. Sci.\u003C\u002Fi\u003E 88, 238–245. 
doi: 10.1511\u002F2000.3.238\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1511\u002F2000.3.238\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Vision+and+the+coding+of+natural+images&author=Olshausen+B.+A.&author=Field+D.+J.&publication_year=2000&journal=Am.+Sci.&volume=88&pages=238-245\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B55\" id=\"B55\"\u003E\u003C\u002Fa\u003EOnat, S., König, P., and Jancke, D. (2011). Natural scene evoked population dynamics across cat primary visual cortex captured with voltage-sensitive dye imaging. \u003Ci\u003ECereb. Cortex\u003C\u002Fi\u003E 21, 2542–2554. doi: 10.1093\u002Fcercor\u002Fbhr038\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=21459837\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1093\u002Fcercor\u002Fbhr038\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Natural+scene+evoked+population+dynamics+across+cat+primary+visual+cortex+captured+with+voltage-sensitive+dye+imaging&author=Onat+S.&author=König+P.&author=Jancke+D.&publication_year=2011&journal=Cereb.+Cortex&volume=21&pages=2542-2554\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B56\" id=\"B56\"\u003E\u003C\u002Fa\u003EPurves, D., and Lotto, R. B. (2003). \u003Ci\u003EWhy We See What We Do: An Empirical Theory of Vision.\u003C\u002Fi\u003E Sunderland, MA: Sinauer Associates.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Why+We+See+What+We+Do%3A+An+Empirical+Theory+of+Vision&author=Purves+D.&author=Lotto+R.+B.&publication_year=2003\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\" style=\"margin-bottom:0.5em;\"\u003E \r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B57\" id=\"B57\"\u003E\u003C\u002Fa\u003EPurves, D., and Lotto, R. B. (2011). \u003Ci\u003EWhy We See What We Do Redux: A Wholly Empirical Theory of Vision.\u003C\u002Fi\u003E Sunderland, MA: Sinauer Associates.\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B58\" id=\"B58\"\u003E\u003C\u002Fa\u003EPurves, D., Monson, B. B., Sundararajan, J., and Wojtach, W. T. (2014). How biological vision succeeds in the physical world. \u003Ci\u003EProc. Natl. Acad. Sci. U S A\u003C\u002Fi\u003E 111, 4750–4755. 
doi: 10.1073\u002Fpnas.1311309111\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=24639506\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1073\u002Fpnas.1311309111\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=How+biological+vision+succeeds+in+the+physical+world&author=Purves+D.&author=Monson+B.+B.&author=Sundararajan+J.&author=Wojtach+W.+T.&publication_year=2014&journal=Proc.+Natl.+Acad.+Sci.+U+S+A&volume=111&pages=4750-4755\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B59\" id=\"B59\"\u003E\u003C\u002Fa\u003EPurves, D., Wojtach, W. T., and Lotto, R. B. (2011). Understanding vision in wholly empirical terms. \u003Ci\u003EProc. Natl. Acad. Sci. U S A\u003C\u002Fi\u003E 108, 15588–15595. doi: 10.1073\u002Fpnas.1012178108\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=21383192\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1073\u002Fpnas.1012178108\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Understanding+vision+in+wholly+empirical+terms&author=Purves+D.&author=Wojtach+W.+T.&author=Lotto+R.+B.&publication_year=2011&journal=Proc.+Natl.+Acad.+Sci.+U+S+A&volume=108&pages=15588-15595\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B60\" id=\"B60\"\u003E\u003C\u002Fa\u003ERao, R. P., and Ballard, D. H. (1999). Predictive coding in the visual cortex: a functional interpretation of some extra-classical receptive-field effects. \u003Ci\u003ENat. Neurosci.\u003C\u002Fi\u003E 2, 79–87. doi: 10.1038\u002F4580\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=10195184\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1038\u002F4580\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Predictive+coding+in+the+visual+cortex%3A+a+functional+interpretation+of+some+extra-classical+receptive-field+effects&author=Rao+R.+P.&author=Ballard+D.+H.&publication_year=1999&journal=Nat.+Neurosci.&volume=2&pages=79-87\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B61\" id=\"B61\"\u003E\u003C\u002Fa\u003ERobinson, J. O. (1998). \u003Ci\u003EThe Psychology of Visual Illusions.\u003C\u002Fi\u003E New York: Dover (corrected republication of the 1972 edition published by Hutchinson and Co. 
in England).\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=The+Psychology+of+Visual+Illusions&author=Robinson+J.+O.&publication_year=1998\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B62\" id=\"B62\"\u003E\u003C\u002Fa\u003ERock, I. (1984). \u003Ci\u003EPerception.\u003C\u002Fi\u003E New York: MacMillan. \u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=6442161\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Perception&author=Rock+I.&publication_year=1984\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B630\" id=\"B630\"\u003E\u003C\u002Fa\u003ESakmann, B., and Creutzfeldt, O. D. (1969). Scotopic and mesopic light adaptation in the cat’s retina. \u003Ci\u003EPlügers Archiv.\u003C\u002Fi\u003E 313, 168–185. doi: 10.1007\u002FBF00586245\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1007\u002FBF00586245\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Scotopic+and+mesopic+light+adaptation+in+the+cat's+retina&author=Sakmann+B.&author=Creutzfeldt+O.+D.&publication_year=1969&journal=Plügers+Archiv.&volume=313&pages=168-185\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B63\" id=\"B63\"\u003E\u003C\u002Fa\u003ESchwartz, O., and Simoncelli, E. P. (2001). Natural signal statistics and sensory gain control. \u003Ci\u003ENat. Neurosci.\u003C\u002Fi\u003E 4, 819–825. doi: 10.1038\u002F90526\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=11477428\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1038\u002F90526\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Natural+signal+statistics+and+sensory+gain+control&author=Schwartz+O.&author=Simoncelli+E.+P.&publication_year=2001&journal=Nat.+Neurosci.&volume=4&pages=819-825\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B64\" id=\"B64\"\u003E\u003C\u002Fa\u003EShannon, C. E. (1948). A mathematical theory of communication. \u003Ci\u003EBell Sys. Tech. 
J.\u003C\u002Fi\u003E 27, 379–423, 623–656.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=A+mathematical+theory+of+communication&author=Shannon+C.+E.&publication_year=1948&journal=Bell+Sys.+Tech.+J.&volume=27&pages=379-423\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B65\" id=\"B65\"\u003E\u003C\u002Fa\u003EShannon, C. E., and Weaver, W. (1949). \u003Ci\u003EThe Mathematical Theory of Communication.\u003C\u002Fi\u003E Chicago: University of Illinois Press.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=The+Mathematical+Theory+of+Communication&author=Shannon+C.+E.&author=Weaver+W.&publication_year=1949\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B66\" id=\"B66\"\u003E\u003C\u002Fa\u003ESherrington, C. S. (1947). \u003Ci\u003EThe Integrative Action of the Nervous System.\u003C\u002Fi\u003E New Haven, CT: Yale University Press.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=The+Integrative+Action+of+the+Nervous+System&author=Sherrington+C.+S.&publication_year=1947\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B67\" id=\"B67\"\u003E\u003C\u002Fa\u003ESimoncelli, E. P., and Olshausen, B. A. (2001). Natural image statistics and neural representation. \u003Ci\u003EAnnu. Rev. Neurosci.\u003C\u002Fi\u003E 24, 1193–1216. doi: 10.1146\u002Fannurev.neuro.24.1.1193\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=11520932\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1146\u002Fannurev.neuro.24.1.1193\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Natural+image+statistics+and+neural+representation&author=Simoncelli+E.+P.&author=Olshausen+B.+A.&publication_year=2001&journal=Annu.+Rev.+Neurosci.&volume=24&pages=1193-1216\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B68\" id=\"B68\"\u003E\u003C\u002Fa\u003ESrinivasan, M. V., Laughlin, S. B., and Dubs, A. (1982). Predictive coding: a fresh view of inhibition in the retina. \u003Ci\u003EProc. R. Lond. B. Biol. Sci.\u003C\u002Fi\u003E 216, 427–459. 
doi: 10.1098\u002Frspb.1982.0085\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=6129637\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1098\u002Frspb.1982.0085\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Predictive+coding%3A+a+fresh+view+of+inhibition+in+the+retina&author=Srinivasan+M.+V.&author=Laughlin+S.+B.&author=Dubs+A.&publication_year=1982&journal=Proc.+R.+Lond.+B.+Biol.+Sci.&volume=216&pages=427-459\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B69\" id=\"B69\"\u003E\u003C\u002Fa\u003EStevens, S. S. (1975). \u003Ci\u003EPsychophysics.\u003C\u002Fi\u003E New York: John Wiley.\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Psychophysics&author=Stevens+S.+S.&publication_year=1975\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B70\" id=\"B70\"\u003E\u003C\u002Fa\u003EStocker, A. A., and Simoncelli, E. P. (2006). Noise characteristics and prior expectations in human visual speed perception. \u003Ci\u003ENat. Neurosci.\u003C\u002Fi\u003E 9, 578–585. doi: 10.1038\u002Fnn1669\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=16547513\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1038\u002Fnn1669\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Noise+characteristics+and+prior+expectations+in+human+visual+speed+perception&author=Stocker+A.+A.&author=Simoncelli+E.+P.&publication_year=2006&journal=Nat.+Neurosci.&volume=9&pages=578-585\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B71\" id=\"B71\"\u003E\u003C\u002Fa\u003ESung, K., Wojtach, W. T., and Purves, D. (2009). An empirical explanation of aperture effects. \u003Ci\u003EProc. Natl. Acad. Sci. U S A\u003C\u002Fi\u003E 106, 298–303. 
doi: 10.1073\u002Fpnas.0811702106\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=19114661\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1073\u002Fpnas.0811702106\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=An+empirical+explanation+of+aperture+effects&author=Sung+K.&author=Wojtach+W.+T.&author=Purves+D.&publication_year=2009&journal=Proc.+Natl.+Acad.+Sci.+U+S+A&volume=106&pages=298-303\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B72\" id=\"B72\"\u003E\u003C\u002Fa\u003ETassinari, H., Hudson, T. E., and Landy, M. S. (2006). Combining priors and noisy visual cues in a rapid pointing task. \u003Ci\u003EJ. Neurosci.\u003C\u002Fi\u003E 26, 10154–10163. doi: 10.1523\u002Fjneurosci.2779-06.2006\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=17021171\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1523\u002Fjneurosci.2779-06.2006\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Combining+priors+and+noisy+visual+cues+in+a+rapid+pointing+task&author=Tassinari+H.&author=Hudson+T.+E.&author=Landy+M.+S.&publication_year=2006&journal=J.+Neurosci.&volume=26&pages=10154-10163\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B73\" id=\"B73\"\u003E\u003C\u002Fa\u003Evan Hateren, J. H., and van der Schaaf, A. (1998). Independent component filters of natural images compared with simple cells in primary visual cortex. \u003Ci\u003EProc. R Soc. Lond. B.\u003C\u002Fi\u003E 265, 359–366. doi: 10.1098\u002Frspb.1998.0303\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=9523437\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1098\u002Frspb.1998.0303\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Independent+component+filters+of+natural+images+compared+with+simple+cells+in+primary+visual+cortex&author=van+Hateren+J.+H.&author=van+der+Schaaf+A.&publication_year=1998&journal=Proc.+R+Soc.+Lond.+B.&volume=265&pages=359-366\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B74\" id=\"B74\"\u003E\u003C\u002Fa\u003EVinje, W. E., and Gallant, J. L. (2000). Sparse coding and decorrelation in primary visual cortex during natural vision. \u003Ci\u003EScience\u003C\u002Fi\u003E 287, 1273–1276. 
doi: 10.1126\u002Fscience.287.5456.1273\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=10678835\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1126\u002Fscience.287.5456.1273\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Sparse+coding+and+decorrelation+in+primary+visual+cortex+during+natural+vision&author=Vinje+W.+E.&author=Gallant+J.+L.&publication_year=2000&journal=Science&volume=287&pages=1273-1276\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B75\" id=\"B75\"\u003E\u003C\u002Fa\u003EWeiss, Y., Simoncelli, E. P., and Adelson, E. H. (2002). Motion illusions as optimal percepts. \u003Ci\u003ENat. Neurosci.\u003C\u002Fi\u003E 5, 598–604. doi: 10.1038\u002Fnn858\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=12021763\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1038\u002Fnn858\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=Motion+illusions+as+optimal+percepts&author=Weiss+Y.&author=Simoncelli+E.+P.&author=Adelson+E.+H.&publication_year=2002&journal=Nat.+Neurosci.&volume=5&pages=598-604\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\" style=\"margin-bottom:0.5em;\"\u003E \r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B76\" id=\"B76\"\u003E\u003C\u002Fa\u003EWertheimer, M. (1912\u002F1950). “Laws of organization in perceptual forms,” in \u003Ci\u003EA Sourcebook of Gestalt Psychology\u003C\u002Fi\u003E, ed. W. D. Ellis Translator (New York: Humanities Press), 71–88.\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B77\" id=\"B77\"\u003E\u003C\u002Fa\u003EWojtach, W. T., Sung, K., Truong, S., and Purves, D. (2008). An empirical explanation of the flash-lag effect. \u003Ci\u003EProc. Natl. Acad. Sci. U S A\u003C\u002Fi\u003E 105, 16338–16343. 
doi: 10.1073\u002Fpnas.0808916105\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=18852459\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1073\u002Fpnas.0808916105\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=An+empirical+explanation+of+the+flash-lag+effect&author=Wojtach+W.+T.&author=Sung+K.&author=Truong+S.&author=Purves+D.&publication_year=2008&journal=Proc.+Natl.+Acad.+Sci.+U+S+A&volume=105&pages=16338-16343\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B78\" id=\"B78\"\u003E\u003C\u002Fa\u003EWojtach, W. T., Sung, K., and Purves, D. (2009). An empirical explanation of the speed-distance effect. \u003Ci\u003EPLoS One\u003C\u002Fi\u003E 4:e6771. doi: 10.1371\u002Fjournal.pone.0006771\u003C\u002Fp\u003E\r\n\u003Cp class=\"ReferencesCopy2\"\u003E\u003Ca href=\"http:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fsites\u002Fentrez?Db=pubmed&Cmd=ShowDetailView&TermToSearch=19707552\" target=\"_blank\"\u003EPubMed Abstract\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fdx.doi.org\u002F10.1371\u002Fjournal.pone.0006771\" target=\"_blank\"\u003ECrossRef Full Text\u003C\u002Fa\u003E | \u003Ca href=\"http:\u002F\u002Fscholar.google.com\u002Fscholar_lookup?title=An+empirical+explanation+of+the+speed-distance+effect&author=Wojtach+W.+T.&author=Sung+K.&author=Purves+D.&publication_year=2009&journal=PLoS+One&volume=4&pages=e6771\" target=\"_blank\"\u003EGoogle Scholar\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fdiv\u003E\r\n\u003Cdiv class=\"References\"\u003E\r\n\u003Cp class=\"ReferencesCopy1\"\u003E\u003Ca name=\"B79\" id=\"B79\"\u003E\u003C\u002Fa\u003EYang, Z., and Purves, D. (2004). The statistical structure of natural light patterns determines perceived light intensity. \u003Ci\u003EProc. Natl. Acad. Sci. U S A\u003C\u002Fi\u003E 101, 8745–8750. 
Keywords: vision, visual perception, feature detection, Bayesian probability, efficient coding, empirical ranking

Citation: Purves D, Morgenstern Y and Wojtach WT (2015) Perception and Reality: Why a Wholly Empirical Paradigm is Needed to Understand Vision. Front. Syst. Neurosci. 9:156. doi: 10.3389/fnsys.2015.00156

Received: 30 July 2015; Accepted: 29 October 2015; Published: 18 November 2015.

Edited by: Chrystalina A. Antoniades, University of Oxford, UK

Reviewed by: Dirk Jancke, Ruhr-University Bochum, Germany; Rava Azeredo Da Silveira, Ecole Normale Supérieure, France; Walter Glannon, University of Calgary, Canada

Copyright © 2015 Purves, Morgenstern and Wojtach. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution and reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Dale Purves, purves@neuro.duke.edu
article:\",\"linkedArticleCopyText\":\"This article is a commentary on:\"}],\"articleIdCopyText\":[]}'\n",articleTypeConfigurableLabel:"\u003C\u003Carticle-type:uppercase\u003E\u003E article",terminologySettings:"'{\"terms\":[{\"sequenceNumber\":1,\"key\":\"frontiers\",\"tenantTerm\":\"Frontiers\",\"frontiersDefaultTerm\":\"Frontiers\",\"category\":\"Customer\"},{\"sequenceNumber\":2,\"key\":\"submission_system\",\"tenantTerm\":\"submission system\",\"frontiersDefaultTerm\":\"submission system\",\"category\":\"Product\"},{\"sequenceNumber\":3,\"key\":\"public_pages\",\"tenantTerm\":\"public pages\",\"frontiersDefaultTerm\":\"public pages\",\"category\":\"Product\"},{\"sequenceNumber\":4,\"key\":\"my_frontiers\",\"tenantTerm\":\"my frontiers\",\"frontiersDefaultTerm\":\"my frontiers\",\"category\":\"Product\"},{\"sequenceNumber\":5,\"key\":\"digital_editorial_office\",\"tenantTerm\":\"digital editorial office\",\"frontiersDefaultTerm\":\"digital editorial office\",\"category\":\"Product\"},{\"sequenceNumber\":6,\"key\":\"deo\",\"tenantTerm\":\"DEO\",\"frontiersDefaultTerm\":\"DEO\",\"category\":\"Product\"},{\"sequenceNumber\":7,\"key\":\"digital_editorial_office_for_chiefs\",\"tenantTerm\":\"digital editorial office for chiefs\",\"frontiersDefaultTerm\":\"digital editorial office for chiefs\",\"category\":\"Product\"},{\"sequenceNumber\":8,\"key\":\"digital_editorial_office_for_eof\",\"tenantTerm\":\"digital editorial office for eof\",\"frontiersDefaultTerm\":\"digital editorial office for eof\",\"category\":\"Product\"},{\"sequenceNumber\":9,\"key\":\"editorial_office\",\"tenantTerm\":\"editorial office\",\"frontiersDefaultTerm\":\"editorial office\",\"category\":\"Product\"},{\"sequenceNumber\":10,\"key\":\"eof\",\"tenantTerm\":\"EOF\",\"frontiersDefaultTerm\":\"EOF\",\"category\":\"Product\"},{\"sequenceNumber\":11,\"key\":\"research_topic_management\",\"tenantTerm\":\"research topic management\",\"frontiersDefaultTerm\":\"research topic management\",\"category\":\"Product\"},{\"sequenceNumber\":12,\"key\":\"review_forum\",\"tenantTerm\":\"review forum\",\"frontiersDefaultTerm\":\"review forum\",\"category\":\"Product\"},{\"sequenceNumber\":13,\"key\":\"accounting_office\",\"tenantTerm\":\"accounting office\",\"frontiersDefaultTerm\":\"accounting office\",\"category\":\"Product\"},{\"sequenceNumber\":14,\"key\":\"aof\",\"tenantTerm\":\"AOF\",\"frontiersDefaultTerm\":\"AOF\",\"category\":\"Product\"},{\"sequenceNumber\":15,\"key\":\"publishing_office\",\"tenantTerm\":\"publishing office\",\"frontiersDefaultTerm\":\"publishing office\",\"category\":\"Product\"},{\"sequenceNumber\":16,\"key\":\"production_office\",\"tenantTerm\":\"production office forum\",\"frontiersDefaultTerm\":\"production office forum\",\"category\":\"Product\"},{\"sequenceNumber\":17,\"key\":\"pof\",\"tenantTerm\":\"POF\",\"frontiersDefaultTerm\":\"POF\",\"category\":\"Product\"},{\"sequenceNumber\":18,\"key\":\"book_office_forum\",\"tenantTerm\":\"book office forum\",\"frontiersDefaultTerm\":\"book office forum\",\"category\":\"Product\"},{\"sequenceNumber\":19,\"key\":\"bof\",\"tenantTerm\":\"BOF\",\"frontiersDefaultTerm\":\"BOF\",\"category\":\"Product\"},{\"sequenceNumber\":20,\"key\":\"aira\",\"tenantTerm\":\"AIRA\",\"frontiersDefaultTerm\":\"AIRA\",\"category\":\"Product\"},{\"sequenceNumber\":21,\"key\":\"editorial_board_management\",\"tenantTerm\":\"editorial board management\",\"frontiersDefaultTerm\":\"editorial board 
management\",\"category\":\"Product\"},{\"sequenceNumber\":22,\"key\":\"ebm\",\"tenantTerm\":\"EBM\",\"frontiersDefaultTerm\":\"EBM\",\"category\":\"Product\"},{\"sequenceNumber\":23,\"key\":\"domain\",\"tenantTerm\":\"domain\",\"frontiersDefaultTerm\":\"domain\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":24,\"key\":\"journal\",\"tenantTerm\":\"journal\",\"frontiersDefaultTerm\":\"journal\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":25,\"key\":\"section\",\"tenantTerm\":\"section\",\"frontiersDefaultTerm\":\"section\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":26,\"key\":\"domains\",\"tenantTerm\":\"domains\",\"frontiersDefaultTerm\":\"domains\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":27,\"key\":\"specialty_section\",\"tenantTerm\":\"specialty section\",\"frontiersDefaultTerm\":\"specialty section\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":28,\"key\":\"specialty_journal\",\"tenantTerm\":\"specialty journal\",\"frontiersDefaultTerm\":\"specialty journal\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":29,\"key\":\"journals\",\"tenantTerm\":\"journals\",\"frontiersDefaultTerm\":\"journals\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":30,\"key\":\"sections\",\"tenantTerm\":\"sections\",\"frontiersDefaultTerm\":\"sections\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":31,\"key\":\"specialty_sections\",\"tenantTerm\":\"specialty sections\",\"frontiersDefaultTerm\":\"specialty sections\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":32,\"key\":\"specialty_journals\",\"tenantTerm\":\"specialty journals\",\"frontiersDefaultTerm\":\"specialty journals\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":33,\"key\":\"manuscript\",\"tenantTerm\":\"manuscript\",\"frontiersDefaultTerm\":\"manuscript\",\"category\":\"Core\"},{\"sequenceNumber\":34,\"key\":\"manuscripts\",\"tenantTerm\":\"manuscripts\",\"frontiersDefaultTerm\":\"manuscripts\",\"category\":\"Core\"},{\"sequenceNumber\":35,\"key\":\"article\",\"tenantTerm\":\"article\",\"frontiersDefaultTerm\":\"article\",\"category\":\"Core\"},{\"sequenceNumber\":36,\"key\":\"articles\",\"tenantTerm\":\"articles\",\"frontiersDefaultTerm\":\"articles\",\"category\":\"Core\"},{\"sequenceNumber\":37,\"key\":\"article_type\",\"tenantTerm\":\"article type\",\"frontiersDefaultTerm\":\"article type\",\"category\":\"Core\"},{\"sequenceNumber\":38,\"key\":\"article_types\",\"tenantTerm\":\"article types\",\"frontiersDefaultTerm\":\"article types\",\"category\":\"Core\"},{\"sequenceNumber\":39,\"key\":\"author\",\"tenantTerm\":\"author\",\"frontiersDefaultTerm\":\"author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":40,\"key\":\"authors\",\"tenantTerm\":\"authors\",\"frontiersDefaultTerm\":\"authors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":41,\"key\":\"authoring\",\"tenantTerm\":\"authoring\",\"frontiersDefaultTerm\":\"authoring\",\"category\":\"Core\"},{\"sequenceNumber\":42,\"key\":\"authored\",\"tenantTerm\":\"authored\",\"frontiersDefaultTerm\":\"authored\",\"category\":\"Core\"},{\"sequenceNumber\":43,\"key\":\"accept\",\"tenantTerm\":\"accept\",\"frontiersDefaultTerm\":\"accept\",\"category\":\"Process\"},{\"sequenceNumber\":44,\"key\":\"accepted\",\"tenantTerm\":\"accepted\",\"frontiersDefaultTerm\":\"accepted\",\"category\":\"Process\"},{\"sequenceNumber\":45,\"key\":\"assistant_field_chief_editor\",\"tenantTerm\":\"Assistant Field Chief Editor\",\"frontiersDefaultTerm\":\"Assistant Field Chief Editor\",\"description\":\"An editorial role on a Field Journal that a Registered User may hold. 
This gives them rights to different functionality and parts of the platform\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":46,\"key\":\"assistant_specialty_chief_editor\",\"tenantTerm\":\"Assistant Specialty Chief Editor\",\"frontiersDefaultTerm\":\"Assistant Specialty Chief Editor\",\"description\":\"An editorial role on a specialty that a Registered User may hold. This gives them rights to different functionality and parts of the platform\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":47,\"key\":\"assistant_specialty_chief_editors\",\"tenantTerm\":\"Assistant Specialty Chief Editors\",\"frontiersDefaultTerm\":\"Assistant Specialty Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":48,\"key\":\"associate_editor\",\"tenantTerm\":\"Associate Editor\",\"frontiersDefaultTerm\":\"Associate Editor\",\"description\":\"An editorial role on a specialty that a Registered User may hold. This gives them rights to different functionality and parts of the platform\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":49,\"key\":\"specialty_chief_editor\",\"tenantTerm\":\"Specialty Chief Editor\",\"frontiersDefaultTerm\":\"Specialty Chief Editor\",\"description\":\"An editorial role on a specialty that a Registered User may hold. This gives them rights to different functionality and parts of the platform\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":50,\"key\":\"specialty_chief_editors\",\"tenantTerm\":\"Specialty Chief Editors\",\"frontiersDefaultTerm\":\"Specialty Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":51,\"key\":\"chief_editor\",\"tenantTerm\":\"Chief Editor\",\"frontiersDefaultTerm\":\"Chief Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":52,\"key\":\"chief_editors\",\"tenantTerm\":\"Chief Editors\",\"frontiersDefaultTerm\":\"Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":53,\"key\":\"call_for_participation\",\"tenantTerm\":\"call for participation\",\"frontiersDefaultTerm\":\"call for participation\",\"category\":\"Process\"},{\"sequenceNumber\":54,\"key\":\"citation\",\"tenantTerm\":\"citation\",\"frontiersDefaultTerm\":\"citation\",\"category\":\"Misc.\"},{\"sequenceNumber\":55,\"key\":\"citations\",\"tenantTerm\":\"citations\",\"frontiersDefaultTerm\":\"citations\",\"category\":\"Misc.\"},{\"sequenceNumber\":56,\"key\":\"contributor\",\"tenantTerm\":\"contributor\",\"frontiersDefaultTerm\":\"contributor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":57,\"key\":\"contributors\",\"tenantTerm\":\"contributors\",\"frontiersDefaultTerm\":\"contributors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":58,\"key\":\"corresponding_author\",\"tenantTerm\":\"corresponding author\",\"frontiersDefaultTerm\":\"corresponding author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":59,\"key\":\"corresponding_authors\",\"tenantTerm\":\"corresponding authors\",\"frontiersDefaultTerm\":\"corresponding authors\",\"category\":\"Label 
(Role)\"},{\"sequenceNumber\":60,\"key\":\"decline\",\"tenantTerm\":\"decline\",\"frontiersDefaultTerm\":\"decline\",\"category\":\"Process\"},{\"sequenceNumber\":61,\"key\":\"declined\",\"tenantTerm\":\"declined\",\"frontiersDefaultTerm\":\"declined\",\"category\":\"Process\"},{\"sequenceNumber\":62,\"key\":\"reject\",\"tenantTerm\":\"reject\",\"frontiersDefaultTerm\":\"reject\",\"category\":\"Process\"},{\"sequenceNumber\":63,\"key\":\"rejected\",\"tenantTerm\":\"rejected\",\"frontiersDefaultTerm\":\"rejected\",\"category\":\"Process\"},{\"sequenceNumber\":64,\"key\":\"publish\",\"tenantTerm\":\"publish\",\"frontiersDefaultTerm\":\"publish\",\"category\":\"Core\"},{\"sequenceNumber\":65,\"key\":\"published\",\"tenantTerm\":\"published\",\"frontiersDefaultTerm\":\"published\",\"category\":\"Core\"},{\"sequenceNumber\":66,\"key\":\"publication\",\"tenantTerm\":\"publication\",\"frontiersDefaultTerm\":\"publication\",\"category\":\"Core\"},{\"sequenceNumber\":67,\"key\":\"peer_review\",\"tenantTerm\":\"peer review\",\"frontiersDefaultTerm\":\"peer review\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":68,\"key\":\"peer_reviewed\",\"tenantTerm\":\"peer reviewed\",\"frontiersDefaultTerm\":\"peer reviewed\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":69,\"key\":\"initial_validation\",\"tenantTerm\":\"initial validation\",\"frontiersDefaultTerm\":\"initial validation\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":70,\"key\":\"editorial_assignment\",\"tenantTerm\":\"editorial assignment\",\"frontiersDefaultTerm\":\"editorial assignment\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":71,\"key\":\"independent_review\",\"tenantTerm\":\"independent review\",\"frontiersDefaultTerm\":\"independent review\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":72,\"key\":\"interactive_review\",\"tenantTerm\":\"interactive review\",\"frontiersDefaultTerm\":\"interactive review\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":73,\"key\":\"review\",\"tenantTerm\":\"review\",\"frontiersDefaultTerm\":\"review\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":74,\"key\":\"reviewing\",\"tenantTerm\":\"reviewing\",\"frontiersDefaultTerm\":\"reviewing\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":75,\"key\":\"reviewer\",\"tenantTerm\":\"reviewer\",\"frontiersDefaultTerm\":\"reviewer\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":76,\"key\":\"reviewers\",\"tenantTerm\":\"reviewers\",\"frontiersDefaultTerm\":\"reviewers\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":77,\"key\":\"review_finalized\",\"tenantTerm\":\"review finalized\",\"frontiersDefaultTerm\":\"review finalized\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":78,\"key\":\"final_decision\",\"tenantTerm\":\"final decision\",\"frontiersDefaultTerm\":\"final decision\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":79,\"key\":\"final_validation\",\"tenantTerm\":\"final validation\",\"frontiersDefaultTerm\":\"final validation\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":80,\"key\":\"ae_accept_manuscript\",\"tenantTerm\":\"recommend to accept manuscript\",\"frontiersDefaultTerm\":\"accept 
manuscript\",\"category\":\"Process\"},{\"sequenceNumber\":81,\"key\":\"fee\",\"tenantTerm\":\"fee\",\"frontiersDefaultTerm\":\"fee\",\"category\":\"Accounting\"},{\"sequenceNumber\":82,\"key\":\"fees\",\"tenantTerm\":\"fees\",\"frontiersDefaultTerm\":\"fees\",\"category\":\"Accounting\"},{\"sequenceNumber\":83,\"key\":\"guest_associate_editor\",\"tenantTerm\":\"Guest Associate Editor\",\"frontiersDefaultTerm\":\"Guest Associate Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":84,\"key\":\"guest_associate_editors\",\"tenantTerm\":\"Guest Associate Editors\",\"frontiersDefaultTerm\":\"Guest Associate Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":85,\"key\":\"in_review\",\"tenantTerm\":\"in review\",\"frontiersDefaultTerm\":\"in review\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":86,\"key\":\"institutional_member\",\"tenantTerm\":\"institutional partner\",\"frontiersDefaultTerm\":\"institutional partner\",\"category\":\"Accounting\"},{\"sequenceNumber\":87,\"key\":\"institutional_membership\",\"tenantTerm\":\"institutional partnership\",\"frontiersDefaultTerm\":\"institutional partnership\",\"category\":\"Accounting\"},{\"sequenceNumber\":88,\"key\":\"article_processing_charge\",\"tenantTerm\":\"article processing charge\",\"frontiersDefaultTerm\":\"article processing charge\",\"category\":\"Accounting\"},{\"sequenceNumber\":89,\"key\":\"article_processing_charges\",\"tenantTerm\":\"article processing charges\",\"frontiersDefaultTerm\":\"article processing charges\",\"category\":\"Accounting\"},{\"sequenceNumber\":90,\"key\":\"apcs\",\"tenantTerm\":\"APCs\",\"frontiersDefaultTerm\":\"APCs\",\"category\":\"Accounting\"},{\"sequenceNumber\":91,\"key\":\"apc\",\"tenantTerm\":\"APC\",\"frontiersDefaultTerm\":\"APC\",\"category\":\"Accounting\"},{\"sequenceNumber\":92,\"key\":\"received\",\"tenantTerm\":\"received\",\"frontiersDefaultTerm\":\"received\",\"description\":\"Date manuscript was received on.\",\"category\":\"Core\"},{\"sequenceNumber\":93,\"key\":\"transferred\",\"tenantTerm\":\"transferred\",\"frontiersDefaultTerm\":\"transferred\",\"category\":\"Core\"},{\"sequenceNumber\":94,\"key\":\"transfer\",\"tenantTerm\":\"transfer\",\"frontiersDefaultTerm\":\"transfer\",\"category\":\"Core\"},{\"sequenceNumber\":95,\"key\":\"research_topic\",\"tenantTerm\":\"research topic\",\"frontiersDefaultTerm\":\"research topic\",\"category\":\"Core\"},{\"sequenceNumber\":96,\"key\":\"research_topics\",\"tenantTerm\":\"research topics\",\"frontiersDefaultTerm\":\"research topics\",\"category\":\"Core\"},{\"sequenceNumber\":97,\"key\":\"topic_editor\",\"tenantTerm\":\"Topic Editor\",\"frontiersDefaultTerm\":\"Topic Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":98,\"key\":\"review_editor\",\"tenantTerm\":\"Review Editor\",\"frontiersDefaultTerm\":\"Review Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":99,\"key\":\"title\",\"tenantTerm\":\"title\",\"frontiersDefaultTerm\":\"title\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":100,\"key\":\"running_title\",\"tenantTerm\":\"running title\",\"frontiersDefaultTerm\":\"running title\",\"category\":\"Manuscript 
Metadata\"},{\"sequenceNumber\":101,\"key\":\"submit\",\"tenantTerm\":\"submit\",\"frontiersDefaultTerm\":\"submit\",\"category\":\"Process\"},{\"sequenceNumber\":102,\"key\":\"submitted\",\"tenantTerm\":\"submitted\",\"frontiersDefaultTerm\":\"submitted\",\"category\":\"Process\"},{\"sequenceNumber\":103,\"key\":\"submitting\",\"tenantTerm\":\"submitting\",\"frontiersDefaultTerm\":\"submitting\",\"category\":\"Process\"},{\"sequenceNumber\":104,\"key\":\"t_e\",\"tenantTerm\":\"TE\",\"frontiersDefaultTerm\":\"TE\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":105,\"key\":\"topic\",\"tenantTerm\":\"topic\",\"frontiersDefaultTerm\":\"topic\",\"category\":\"Process\"},{\"sequenceNumber\":106,\"key\":\"topic_summary\",\"tenantTerm\":\"topic summary\",\"frontiersDefaultTerm\":\"topic summary\",\"category\":\"Process\"},{\"sequenceNumber\":107,\"key\":\"figure\",\"tenantTerm\":\"figure\",\"frontiersDefaultTerm\":\"figure\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":108,\"key\":\"figures\",\"tenantTerm\":\"figures\",\"frontiersDefaultTerm\":\"figures\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":109,\"key\":\"editorial_file\",\"tenantTerm\":\"editorial file\",\"frontiersDefaultTerm\":\"editorial file\",\"category\":\"Core\"},{\"sequenceNumber\":110,\"key\":\"editorial_files\",\"tenantTerm\":\"editorial files\",\"frontiersDefaultTerm\":\"editorial files\",\"category\":\"Core\"},{\"sequenceNumber\":111,\"key\":\"e_book\",\"tenantTerm\":\"e-book\",\"frontiersDefaultTerm\":\"e-book\",\"category\":\"Core\"},{\"sequenceNumber\":112,\"key\":\"organization\",\"tenantTerm\":\"organization\",\"frontiersDefaultTerm\":\"organization\",\"category\":\"Core\"},{\"sequenceNumber\":113,\"key\":\"institution\",\"tenantTerm\":\"institution\",\"frontiersDefaultTerm\":\"institution\",\"category\":\"Core\"},{\"sequenceNumber\":114,\"key\":\"reference\",\"tenantTerm\":\"reference\",\"frontiersDefaultTerm\":\"reference\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":115,\"key\":\"references\",\"tenantTerm\":\"references\",\"frontiersDefaultTerm\":\"references\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":116,\"key\":\"sce\",\"tenantTerm\":\"SCE\",\"frontiersDefaultTerm\":\"SCE\",\"description\":\"Abbreviation for Specialty Chief Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":117,\"key\":\"submission\",\"tenantTerm\":\"submission\",\"frontiersDefaultTerm\":\"submission\",\"category\":\"Process\"},{\"sequenceNumber\":118,\"key\":\"submissions\",\"tenantTerm\":\"submissions\",\"frontiersDefaultTerm\":\"submissions\",\"category\":\"Process\"},{\"sequenceNumber\":119,\"key\":\"editing\",\"tenantTerm\":\"editing\",\"frontiersDefaultTerm\":\"editing\",\"category\":\"Process\"},{\"sequenceNumber\":120,\"key\":\"in_preparation\",\"tenantTerm\":\"in preparation\",\"frontiersDefaultTerm\":\"in preparation\",\"category\":\"Process\"},{\"sequenceNumber\":121,\"key\":\"country_region\",\"tenantTerm\":\"country\u002Fregion\",\"frontiersDefaultTerm\":\"country\u002Fregion\",\"description\":\"Because of political issues, some of the country listings are actually classified as `regions` and we need to include this. 
However other clients may not want to do this.\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":122,\"key\":\"countries_regions\",\"tenantTerm\":\"countries\u002Fregions\",\"frontiersDefaultTerm\":\"countries\u002Fregions\",\"description\":\"Because of political issues, some of the country listings are actually classified as `regions` and we need to include this. However other clients may not want to do this.\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":123,\"key\":\"specialty\",\"tenantTerm\":\"specialty\",\"frontiersDefaultTerm\":\"specialty\",\"category\":\"Core\"},{\"sequenceNumber\":124,\"key\":\"specialties\",\"tenantTerm\":\"specialties\",\"frontiersDefaultTerm\":\"specialties\",\"category\":\"Core\"},{\"sequenceNumber\":125,\"key\":\"associate_editors\",\"tenantTerm\":\"Associate Editors\",\"frontiersDefaultTerm\":\"Associate Editors\",\"description\":\"An editorial role on a specialty that a Registered User may hold. This gives them rights to different functionality and parts of the platform\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":126,\"key\":\"reviewed\",\"tenantTerm\":\"reviewed\",\"frontiersDefaultTerm\":\"reviewed\",\"category\":\"Peer Review Process\"},{\"sequenceNumber\":127,\"key\":\"institutional_members\",\"tenantTerm\":\"institutional partners\",\"frontiersDefaultTerm\":\"institutional partners\",\"category\":\"Accounting\"},{\"sequenceNumber\":128,\"key\":\"institutional_memberships\",\"tenantTerm\":\"institutional partnerships\",\"frontiersDefaultTerm\":\"institutional partnerships\",\"category\":\"Accounting\"},{\"sequenceNumber\":129,\"key\":\"assistant_field_chief_editors\",\"tenantTerm\":\"Assistant Field Chief Editors\",\"frontiersDefaultTerm\":\"Assistant Field Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":130,\"key\":\"publications\",\"tenantTerm\":\"publications\",\"frontiersDefaultTerm\":\"publications\",\"category\":\"Process\"},{\"sequenceNumber\":131,\"key\":\"ae_accepted\",\"tenantTerm\":\"recommended acceptance\",\"frontiersDefaultTerm\":\"accepted\",\"category\":\"Process\"},{\"sequenceNumber\":132,\"key\":\"field_journal\",\"tenantTerm\":\"field journal\",\"frontiersDefaultTerm\":\"field journal\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":133,\"key\":\"field_journals\",\"tenantTerm\":\"field journals\",\"frontiersDefaultTerm\":\"field journals\",\"category\":\"Taxonomy\"},{\"sequenceNumber\":134,\"key\":\"program_manager\",\"tenantTerm\":\"program manager\",\"frontiersDefaultTerm\":\"program manager\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":135,\"key\":\"journal_manager\",\"tenantTerm\":\"journal manager\",\"frontiersDefaultTerm\":\"journal manager\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":136,\"key\":\"journal_specialist\",\"tenantTerm\":\"journal specialist\",\"frontiersDefaultTerm\":\"journal specialist\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":137,\"key\":\"program_managers\",\"tenantTerm\":\"program managers\",\"frontiersDefaultTerm\":\"program managers\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":138,\"key\":\"journal_managers\",\"tenantTerm\":\"journal managers\",\"frontiersDefaultTerm\":\"journal managers\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":139,\"key\":\"journal_specialists\",\"tenantTerm\":\"journal specialists\",\"frontiersDefaultTerm\":\"journal specialists\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":140,\"key\":\"cover_letter\",\"tenantTerm\":\"manuscript contribution to the 
field\",\"frontiersDefaultTerm\":\"manuscript contribution to the field\",\"category\":\"Process\"},{\"sequenceNumber\":141,\"key\":\"ae_accepted_manuscript\",\"tenantTerm\":\"recommended to accept manuscript\",\"frontiersDefaultTerm\":\"accepted manuscript\",\"category\":\"Process\"},{\"sequenceNumber\":142,\"key\":\"recommend_for_rejection\",\"tenantTerm\":\"recommend for rejection\",\"frontiersDefaultTerm\":\"recommend for rejection\",\"category\":\"Process\"},{\"sequenceNumber\":143,\"key\":\"recommended_for_rejection\",\"tenantTerm\":\"recommended for rejection\",\"frontiersDefaultTerm\":\"recommended for rejection\",\"category\":\"Process\"},{\"sequenceNumber\":144,\"key\":\"ae\",\"tenantTerm\":\"AE\",\"frontiersDefaultTerm\":\"AE\",\"description\":\"Associate Editor - board member\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":145,\"key\":\"re\",\"tenantTerm\":\"RE\",\"frontiersDefaultTerm\":\"RE\",\"description\":\"Review Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":146,\"key\":\"rev\",\"tenantTerm\":\"REV\",\"frontiersDefaultTerm\":\"REV\",\"description\":\"Reviewer\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":147,\"key\":\"aut\",\"tenantTerm\":\"AUT\",\"frontiersDefaultTerm\":\"AUT\",\"description\":\"Author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":148,\"key\":\"coraut\",\"tenantTerm\":\"CORAUT\",\"frontiersDefaultTerm\":\"CORAUT\",\"description\":\"Corresponding author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":149,\"key\":\"saut\",\"tenantTerm\":\"SAUT\",\"frontiersDefaultTerm\":\"SAUT\",\"description\":\"Submitting author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":150,\"key\":\"coaut\",\"tenantTerm\":\"COAUT\",\"frontiersDefaultTerm\":\"COAUT\",\"description\":\"co-author\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":151,\"key\":\"tsof\",\"tenantTerm\":\"TSOF\",\"frontiersDefaultTerm\":\"TSOF\",\"description\":\"Typesetter\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":152,\"key\":\"typesetting_office\",\"tenantTerm\":\"typesetting office\",\"frontiersDefaultTerm\":\"typesetting office\",\"category\":\"Product\"},{\"sequenceNumber\":153,\"key\":\"config\",\"tenantTerm\":\"CONFIG\",\"frontiersDefaultTerm\":\"CONFIG\",\"description\":\"Configuration office role\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":154,\"key\":\"jm\",\"tenantTerm\":\"JM\",\"frontiersDefaultTerm\":\"JM\",\"description\":\"Journal Manager\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":155,\"key\":\"rte\",\"tenantTerm\":\"RTE\",\"frontiersDefaultTerm\":\"RTE\",\"description\":\"Research topic editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":156,\"key\":\"organizations\",\"tenantTerm\":\"organizations\",\"frontiersDefaultTerm\":\"organizations\",\"category\":\"Core\"},{\"sequenceNumber\":157,\"key\":\"publishing\",\"tenantTerm\":\"publishing\",\"frontiersDefaultTerm\":\"publishing\",\"category\":\"Core\"},{\"sequenceNumber\":158,\"key\":\"acceptance\",\"tenantTerm\":\"acceptance\",\"frontiersDefaultTerm\":\"acceptance\",\"category\":\"Process\"},{\"sequenceNumber\":159,\"key\":\"preferred_associate_editor\",\"tenantTerm\":\"preferred associate editor\",\"frontiersDefaultTerm\":\"preferred associate editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":160,\"key\":\"topic_editors\",\"tenantTerm\":\"Topic Editors\",\"frontiersDefaultTerm\":\"Topic Editors\",\"category\":\"Label 
(Role)\"},{\"sequenceNumber\":161,\"key\":\"institutions\",\"tenantTerm\":\"institutions\",\"frontiersDefaultTerm\":\"institutions\",\"category\":\"Core\"},{\"sequenceNumber\":162,\"key\":\"author(s)\",\"tenantTerm\":\"author(s)\",\"frontiersDefaultTerm\":\"author(s)\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":163,\"key\":\"figure(s)\",\"tenantTerm\":\"figure(s)\",\"frontiersDefaultTerm\":\"figure(s)\",\"category\":\"Manuscript Metadata\"},{\"sequenceNumber\":164,\"key\":\"co-authors\",\"tenantTerm\":\"co-authors\",\"frontiersDefaultTerm\":\"co-authors\",\"description\":\"co-authors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":165,\"key\":\"editorial_board_members\",\"tenantTerm\":\"editorial board members\",\"frontiersDefaultTerm\":\"editorial board members\",\"description\":\"editorial board members\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":166,\"key\":\"editorial_board\",\"tenantTerm\":\"editorial board\",\"frontiersDefaultTerm\":\"editorial board\",\"description\":\"editorial board\",\"category\":\"Product\"},{\"sequenceNumber\":167,\"key\":\"co-authorship\",\"tenantTerm\":\"co-authorship\",\"frontiersDefaultTerm\":\"co-authorship\",\"description\":\"co-authorship\",\"category\":\"Misc.\"},{\"sequenceNumber\":168,\"key\":\"role_id_1\",\"tenantTerm\":\"registration office\",\"frontiersDefaultTerm\":\"registration office\",\"category\":\"User Role\"},{\"sequenceNumber\":169,\"key\":\"role_id_2\",\"tenantTerm\":\"editorial office\",\"frontiersDefaultTerm\":\"editorial office\",\"category\":\"User Role\"},{\"sequenceNumber\":170,\"key\":\"role_id_7\",\"tenantTerm\":\"field chief editor\",\"frontiersDefaultTerm\":\"field chief editor\",\"category\":\"User Role\"},{\"sequenceNumber\":171,\"key\":\"role_id_8\",\"tenantTerm\":\"assistant field chief editor\",\"frontiersDefaultTerm\":\"assistant field chief editor\",\"category\":\"User Role\"},{\"sequenceNumber\":172,\"key\":\"role_id_9\",\"tenantTerm\":\"specialty chief editor\",\"frontiersDefaultTerm\":\"specialty chief editor\",\"category\":\"User Role\"},{\"sequenceNumber\":173,\"key\":\"role_id_10\",\"tenantTerm\":\"assistant specialty chief editor\",\"frontiersDefaultTerm\":\"assistant specialty chief editor\",\"category\":\"User Role\"},{\"sequenceNumber\":174,\"key\":\"role_id_11\",\"tenantTerm\":\"associate editor\",\"frontiersDefaultTerm\":\"associate editor\",\"category\":\"User Role\"},{\"sequenceNumber\":175,\"key\":\"role_id_12\",\"tenantTerm\":\"guest associate editor\",\"frontiersDefaultTerm\":\"guest associate editor\",\"category\":\"User Role\"},{\"sequenceNumber\":176,\"key\":\"role_id_13\",\"tenantTerm\":\"review editor\",\"frontiersDefaultTerm\":\"review editor\",\"category\":\"User Role\"},{\"sequenceNumber\":177,\"key\":\"role_id_14\",\"tenantTerm\":\"reviewer\",\"frontiersDefaultTerm\":\"reviewer\",\"category\":\"User Role\"},{\"sequenceNumber\":178,\"key\":\"role_id_15\",\"tenantTerm\":\"author\",\"frontiersDefaultTerm\":\"author\",\"category\":\"User Role\"},{\"sequenceNumber\":179,\"key\":\"role_id_16\",\"tenantTerm\":\"corresponding author\",\"frontiersDefaultTerm\":\"corresponding author\",\"category\":\"User Role\"},{\"sequenceNumber\":180,\"key\":\"role_id_17\",\"tenantTerm\":\"submitting author\",\"frontiersDefaultTerm\":\"submitting author\",\"category\":\"User Role\"},{\"sequenceNumber\":181,\"key\":\"role_id_18\",\"tenantTerm\":\"co-author\",\"frontiersDefaultTerm\":\"co-author\",\"category\":\"User Role\"},{\"sequenceNumber\":182,\"key\":\"role_id_20\",\"tenantTerm\":\"production 
office\",\"frontiersDefaultTerm\":\"production office\",\"category\":\"User Role\"},{\"sequenceNumber\":183,\"key\":\"role_id_22\",\"tenantTerm\":\"typesetting office (typesetter)\",\"frontiersDefaultTerm\":\"typesetting office (typesetter)\",\"category\":\"User Role\"},{\"sequenceNumber\":184,\"key\":\"role_id_24\",\"tenantTerm\":\"registered user\",\"frontiersDefaultTerm\":\"registered user\",\"category\":\"User Role\"},{\"sequenceNumber\":185,\"key\":\"role_id_35\",\"tenantTerm\":\"job office\",\"frontiersDefaultTerm\":\"job office\",\"category\":\"User Role\"},{\"sequenceNumber\":186,\"key\":\"role_id_41\",\"tenantTerm\":\"special event administrator\",\"frontiersDefaultTerm\":\"special event administrator\",\"category\":\"User Role\"},{\"sequenceNumber\":187,\"key\":\"role_id_42\",\"tenantTerm\":\"special event reviewer\",\"frontiersDefaultTerm\":\"special event reviewer\",\"category\":\"User Role\"},{\"sequenceNumber\":188,\"key\":\"role_id_43\",\"tenantTerm\":\"submit abstract\",\"frontiersDefaultTerm\":\"submit abstract\",\"category\":\"User Role\"},{\"sequenceNumber\":189,\"key\":\"role_id_52\",\"tenantTerm\":\"events office\",\"frontiersDefaultTerm\":\"events office\",\"category\":\"User Role\"},{\"sequenceNumber\":190,\"key\":\"role_id_53\",\"tenantTerm\":\"event administrator\",\"frontiersDefaultTerm\":\"event administrator\",\"category\":\"User Role\"},{\"sequenceNumber\":191,\"key\":\"role_id_89\",\"tenantTerm\":\"content management office\",\"frontiersDefaultTerm\":\"content management office\",\"category\":\"User Role\"},{\"sequenceNumber\":192,\"key\":\"role_id_98\",\"tenantTerm\":\"accounting office\",\"frontiersDefaultTerm\":\"accounting office\",\"category\":\"User Role\"},{\"sequenceNumber\":193,\"key\":\"role_id_99\",\"tenantTerm\":\"projects\",\"frontiersDefaultTerm\":\"projects\",\"category\":\"User Role\"},{\"sequenceNumber\":194,\"key\":\"role_id_103\",\"tenantTerm\":\"configuration office\",\"frontiersDefaultTerm\":\"configuration office\",\"category\":\"User Role\"},{\"sequenceNumber\":195,\"key\":\"role_id_104\",\"tenantTerm\":\"beta user\",\"frontiersDefaultTerm\":\"beta user\",\"category\":\"User Role\"},{\"sequenceNumber\":196,\"key\":\"role_id_106\",\"tenantTerm\":\"wfconf\",\"frontiersDefaultTerm\":\"wfconf\",\"category\":\"User Role\"},{\"sequenceNumber\":197,\"key\":\"role_id_107\",\"tenantTerm\":\"rt management beta user\",\"frontiersDefaultTerm\":\"rt management beta user\",\"category\":\"User Role\"},{\"sequenceNumber\":198,\"key\":\"role_id_108\",\"tenantTerm\":\"deo beta user\",\"frontiersDefaultTerm\":\"deo beta user\",\"category\":\"User Role\"},{\"sequenceNumber\":199,\"key\":\"role_id_109\",\"tenantTerm\":\"search beta user\",\"frontiersDefaultTerm\":\"search beta user\",\"category\":\"User Role\"},{\"sequenceNumber\":200,\"key\":\"role_id_110\",\"tenantTerm\":\"journal manager\",\"frontiersDefaultTerm\":\"journal manager\",\"category\":\"User Role\"},{\"sequenceNumber\":201,\"key\":\"role_id_111\",\"tenantTerm\":\"myfrontiers beta user\",\"frontiersDefaultTerm\":\"myfrontiers beta user\",\"category\":\"User Role\"},{\"sequenceNumber\":202,\"key\":\"role_id_21\",\"tenantTerm\":\"copy editor\",\"frontiersDefaultTerm\":\"copy editor\",\"category\":\"User Role\"},{\"sequenceNumber\":203,\"key\":\"role_id_1_abr\",\"tenantTerm\":\"ROF\",\"frontiersDefaultTerm\":\"ROF\",\"category\":\"User Role\"},{\"sequenceNumber\":204,\"key\":\"role_id_2_abr\",\"tenantTerm\":\"EOF\",\"frontiersDefaultTerm\":\"EOF\",\"category\":\"User 
Role\"},{\"sequenceNumber\":205,\"key\":\"role_id_7_abr\",\"tenantTerm\":\"FCE\",\"frontiersDefaultTerm\":\"FCE\",\"category\":\"User Role\"},{\"sequenceNumber\":206,\"key\":\"role_id_8_abr\",\"tenantTerm\":\"AFCE\",\"frontiersDefaultTerm\":\"AFCE\",\"category\":\"User Role\"},{\"sequenceNumber\":207,\"key\":\"role_id_9_abr\",\"tenantTerm\":\"SCE\",\"frontiersDefaultTerm\":\"SCE\",\"category\":\"User Role\"},{\"sequenceNumber\":208,\"key\":\"role_id_10_abr\",\"tenantTerm\":\"ASCE\",\"frontiersDefaultTerm\":\"ASCE\",\"category\":\"User Role\"},{\"sequenceNumber\":209,\"key\":\"role_id_11_abr\",\"tenantTerm\":\"AE\",\"frontiersDefaultTerm\":\"AE\",\"category\":\"User Role\"},{\"sequenceNumber\":210,\"key\":\"role_id_12_abr\",\"tenantTerm\":\"GAE\",\"frontiersDefaultTerm\":\"GAE\",\"category\":\"User Role\"},{\"sequenceNumber\":211,\"key\":\"role_id_13_abr\",\"tenantTerm\":\"RE\",\"frontiersDefaultTerm\":\"RE\",\"category\":\"User Role\"},{\"sequenceNumber\":212,\"key\":\"role_id_14_abr\",\"tenantTerm\":\"REV\",\"frontiersDefaultTerm\":\"REV\",\"category\":\"User Role\"},{\"sequenceNumber\":213,\"key\":\"role_id_15_abr\",\"tenantTerm\":\"AUT\",\"frontiersDefaultTerm\":\"AUT\",\"category\":\"User Role\"},{\"sequenceNumber\":214,\"key\":\"role_id_16_abr\",\"tenantTerm\":\"CORAUT\",\"frontiersDefaultTerm\":\"CORAUT\",\"category\":\"User Role\"},{\"sequenceNumber\":215,\"key\":\"role_id_17_abr\",\"tenantTerm\":\"SAUT\",\"frontiersDefaultTerm\":\"SAUT\",\"category\":\"User Role\"},{\"sequenceNumber\":216,\"key\":\"role_id_18_abr\",\"tenantTerm\":\"COAUT\",\"frontiersDefaultTerm\":\"COAUT\",\"category\":\"User Role\"},{\"sequenceNumber\":217,\"key\":\"role_id_20_abr\",\"tenantTerm\":\"POF\",\"frontiersDefaultTerm\":\"POF\",\"category\":\"User Role\"},{\"sequenceNumber\":218,\"key\":\"role_id_22_abr\",\"tenantTerm\":\"TSOF\",\"frontiersDefaultTerm\":\"TSOF\",\"category\":\"User Role\"},{\"sequenceNumber\":219,\"key\":\"role_id_24_abr\",\"tenantTerm\":\"RU\",\"frontiersDefaultTerm\":\"RU\",\"category\":\"User Role\"},{\"sequenceNumber\":220,\"key\":\"role_id_35_abr\",\"tenantTerm\":\"JOF\",\"frontiersDefaultTerm\":\"JOF\",\"category\":\"User Role\"},{\"sequenceNumber\":221,\"key\":\"role_id_41_abr\",\"tenantTerm\":\"SE-ADM\",\"frontiersDefaultTerm\":\"SE-ADM\",\"category\":\"User Role\"},{\"sequenceNumber\":222,\"key\":\"role_id_42_abr\",\"tenantTerm\":\"SE-REV\",\"frontiersDefaultTerm\":\"SE-REV\",\"category\":\"User Role\"},{\"sequenceNumber\":223,\"key\":\"role_id_43_abr\",\"tenantTerm\":\"SE-AUT\",\"frontiersDefaultTerm\":\"SE-AUT\",\"category\":\"User Role\"},{\"sequenceNumber\":224,\"key\":\"role_id_52_abr\",\"tenantTerm\":\"EVOF\",\"frontiersDefaultTerm\":\"EVOF\",\"category\":\"User Role\"},{\"sequenceNumber\":225,\"key\":\"role_id_53_abr\",\"tenantTerm\":\"EV-ADM\",\"frontiersDefaultTerm\":\"EV-ADM\",\"category\":\"User Role\"},{\"sequenceNumber\":226,\"key\":\"role_id_89_abr\",\"tenantTerm\":\"COMOF\",\"frontiersDefaultTerm\":\"COMOF\",\"category\":\"User Role\"},{\"sequenceNumber\":227,\"key\":\"role_id_98_abr\",\"tenantTerm\":\"AOF\",\"frontiersDefaultTerm\":\"AOF\",\"category\":\"User Role\"},{\"sequenceNumber\":228,\"key\":\"role_id_99_abr\",\"tenantTerm\":\"Projects\",\"frontiersDefaultTerm\":\"Projects\",\"category\":\"User Role\"},{\"sequenceNumber\":229,\"key\":\"role_id_103_abr\",\"tenantTerm\":\"CONFIG\",\"frontiersDefaultTerm\":\"CONFIG\",\"category\":\"User 
Role\"},{\"sequenceNumber\":230,\"key\":\"role_id_104_abr\",\"tenantTerm\":\"BETA\",\"frontiersDefaultTerm\":\"BETA\",\"category\":\"User Role\"},{\"sequenceNumber\":231,\"key\":\"role_id_106_abr\",\"tenantTerm\":\"WFCONF\",\"frontiersDefaultTerm\":\"WFCONF\",\"category\":\"User Role\"},{\"sequenceNumber\":232,\"key\":\"role_id_107_abr\",\"tenantTerm\":\"RTBETA\",\"frontiersDefaultTerm\":\"RTBETA\",\"category\":\"User Role\"},{\"sequenceNumber\":233,\"key\":\"role_id_108_abr\",\"tenantTerm\":\"DEOBETA\",\"frontiersDefaultTerm\":\"DEOBETA\",\"category\":\"User Role\"},{\"sequenceNumber\":234,\"key\":\"role_id_109_abr\",\"tenantTerm\":\"SEARCHBETA\",\"frontiersDefaultTerm\":\"SEARCHBETA\",\"category\":\"User Role\"},{\"sequenceNumber\":235,\"key\":\"role_id_110_abr\",\"tenantTerm\":\"JM\",\"frontiersDefaultTerm\":\"JM\",\"category\":\"User Role\"},{\"sequenceNumber\":236,\"key\":\"role_id_111_abr\",\"tenantTerm\":\"MFBETA\",\"frontiersDefaultTerm\":\"MFBETA\",\"category\":\"User Role\"},{\"sequenceNumber\":237,\"key\":\"role_id_21_abr\",\"tenantTerm\":\"COPED\",\"frontiersDefaultTerm\":\"COPED\",\"category\":\"User Role\"},{\"sequenceNumber\":238,\"key\":\"reviewer_editorial_board\",\"tenantTerm\":\"editorial board\",\"frontiersDefaultTerm\":\"editorial board\",\"description\":\"This is the label for the review editorial board\",\"category\":\"Label\"},{\"sequenceNumber\":239,\"key\":\"field_chief_editor\",\"tenantTerm\":\"Field Chief Editor\",\"frontiersDefaultTerm\":\"Field Chief Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":240,\"key\":\"field_chief_editors\",\"tenantTerm\":\"Field Chief Editors\",\"frontiersDefaultTerm\":\"Field Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":241,\"key\":\"editor\",\"tenantTerm\":\"editor\",\"frontiersDefaultTerm\":\"editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":242,\"key\":\"editors\",\"tenantTerm\":\"editors\",\"frontiersDefaultTerm\":\"editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":243,\"key\":\"board\",\"tenantTerm\":\"board\",\"frontiersDefaultTerm\":\"board\",\"category\":\"Label\"},{\"sequenceNumber\":244,\"key\":\"boards\",\"tenantTerm\":\"boards\",\"frontiersDefaultTerm\":\"boards\",\"category\":\"Label\"},{\"sequenceNumber\":245,\"key\":\"article_collection\",\"tenantTerm\":\"article collection\",\"frontiersDefaultTerm\":\"article collection\",\"category\":\"Label\"},{\"sequenceNumber\":246,\"key\":\"article_collections\",\"tenantTerm\":\"article collections\",\"frontiersDefaultTerm\":\"article collections\",\"category\":\"Label\"},{\"sequenceNumber\":247,\"key\":\"handling_editor\",\"tenantTerm\":\"handling editor\",\"frontiersDefaultTerm\":\"associate editor\",\"description\":\"This terminology key is for the person assigned to edit a manuscript. It is a label for the temporary handling editor assignment.\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":248,\"key\":\"handling_editors\",\"tenantTerm\":\"handling editors\",\"frontiersDefaultTerm\":\"associate editors\",\"description\":\"This terminology key is for the person assigned to edit a manuscript. 
It is a label for the temporary handling editor assignment.\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":249,\"key\":\"ae_accept\",\"tenantTerm\":\"recommend acceptance\",\"frontiersDefaultTerm\":\"accept\",\"category\":\"Process\"},{\"sequenceNumber\":250,\"key\":\"rtm\",\"tenantTerm\":\"RTM\",\"frontiersDefaultTerm\":\"RTM\",\"category\":\"Product\"},{\"sequenceNumber\":251,\"key\":\"frontiers_media_sa\",\"tenantTerm\":\"Frontiers Media S.A\",\"frontiersDefaultTerm\":\"Frontiers Media S.A\",\"category\":\"Customer\"},{\"sequenceNumber\":252,\"key\":\"review_editors\",\"tenantTerm\":\"Review Editors\",\"frontiersDefaultTerm\":\"Review Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":253,\"key\":\"journal_card_chief_editor\",\"tenantTerm\":\"Chief Editor\",\"frontiersDefaultTerm\":\"Chief Editor\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":254,\"key\":\"journal_card_chief_editors\",\"tenantTerm\":\"Chief Editors\",\"frontiersDefaultTerm\":\"Chief Editors\",\"category\":\"Label (Role)\"},{\"sequenceNumber\":255,\"key\":\"call_for_papers\",\"tenantTerm\":\"Call for papers\",\"frontiersDefaultTerm\":\"Call for papers\",\"category\":\"Label\"},{\"sequenceNumber\":256,\"key\":\"calls_for_papers\",\"tenantTerm\":\"Calls for papers\",\"frontiersDefaultTerm\":\"Calls for papers\",\"category\":\"Label\"},{\"sequenceNumber\":257,\"key\":\"supervising_editor\",\"tenantTerm\":\"Supervising Editor\",\"frontiersDefaultTerm\":\"Supervising Editor\",\"description\":\"A Chief or Assistant Chief editor who is assigned to a manuscript to supervise.\",\"category\":\"Role\",\"externalKey\":\"supervising_editor\"},{\"sequenceNumber\":258,\"key\":\"supervising_editors\",\"tenantTerm\":\"Supervising Editors\",\"frontiersDefaultTerm\":\"Supervising Editors\",\"description\":\"A Chief or Assistant Chief editor who is assigned to a manuscript to supervise.\",\"category\":\"Role\",\"externalKey\":\"supervising_editors\"},{\"sequenceNumber\":259,\"key\":\"reviewer_endorse\",\"tenantTerm\":\"endorse\",\"frontiersDefaultTerm\":\"endorse\",\"category\":\"Label\"},{\"sequenceNumber\":260,\"key\":\"reviewer_endorsed\",\"tenantTerm\":\"endorsed\",\"frontiersDefaultTerm\":\"endorsed\",\"category\":\"Label\"},{\"sequenceNumber\":261,\"key\":\"reviewer_endorse_publication\",\"tenantTerm\":\"endorse publication\",\"frontiersDefaultTerm\":\"endorse publication\",\"category\":\"Label\"},{\"sequenceNumber\":262,\"key\":\"reviewer_endorsed_publication\",\"tenantTerm\":\"endorsed publication\",\"frontiersDefaultTerm\":\"endorsed publication\",\"category\":\"Label\"},{\"sequenceNumber\":263,\"key\":\"editor_role\",\"tenantTerm\":\"editor role\",\"frontiersDefaultTerm\":\"Editor Role\",\"category\":\"Label\"},{\"sequenceNumber\":264,\"key\":\"editor_roles\",\"tenantTerm\":\"editor roles\",\"frontiersDefaultTerm\":\"Editor Roles\",\"category\":\"Label\"},{\"sequenceNumber\":265,\"key\":\"editorial_role\",\"tenantTerm\":\"editorial role\",\"frontiersDefaultTerm\":\"Editorial Role\",\"category\":\"Label\"},{\"sequenceNumber\":266,\"key\":\"editorial_roles\",\"tenantTerm\":\"editorial roles\",\"frontiersDefaultTerm\":\"Editorial Roles\",\"category\":\"Label\"},{\"sequenceNumber\":267,\"key\":\"call_for_paper\",\"tenantTerm\":\"Call for paper\",\"frontiersDefaultTerm\":\"Call for paper\",\"category\":\"Label\"},{\"sequenceNumber\":268,\"key\":\"research_topic_abstract\",\"tenantTerm\":\"manuscript summary\",\"frontiersDefaultTerm\":\"manuscript 
summary\",\"category\":\"Process\"},{\"sequenceNumber\":269,\"key\":\"research_topic_abstracts\",\"tenantTerm\":\"manuscript summaries\",\"frontiersDefaultTerm\":\"manuscript summaries\",\"category\":\"Process\"},{\"sequenceNumber\":270,\"key\":\"submissions_team_manager\",\"tenantTerm\":\"Journal Manager\",\"frontiersDefaultTerm\":\"Content Manager\",\"category\":\"Process\"},{\"sequenceNumber\":271,\"key\":\"submissions_team\",\"tenantTerm\":\"Journal Team\",\"frontiersDefaultTerm\":\"Content Team\",\"category\":\"Process\"},{\"sequenceNumber\":272,\"key\":\"topic_coordinator\",\"tenantTerm\":\"topic coordinator\",\"frontiersDefaultTerm\":\"topic coordinator\",\"category\":\"Process\"},{\"sequenceNumber\":273,\"key\":\"topic_coordinators\",\"tenantTerm\":\"topic coordinators\",\"frontiersDefaultTerm\":\"topic coordinators\",\"category\":\"Process\"}]}'\n",gtmId:"GTM-M322FV2",gtmAuth:"owVbWxfaJr21yQv1fe1cAQ",gtmServerUrl:"https:\u002F\u002Ftag-manager.frontiersin.org",gtmPreview:"env-1",faviconSize512:"https:\u002F\u002Fbrand.frontiersin.org\u002Fm\u002Fed3f9ce840a03d7\u002Ffavicon_512-tenantFavicon-Frontiers.png",socialMediaImg:"https:\u002F\u002Fbrand.frontiersin.org\u002Fm\u002F1c8bcb536c789e11\u002FGuidelines-Frontiers_Logo_1200x628_1-91to1.png",_app:{basePath:"\u002F",assetsPath:"\u002Farticle-pages\u002F_nuxt\u002F",cdnURL:e}},apollo:{contentfulJournalsDelivery:Object.create(null),contentfulJournalsPreview:Object.create(null),contentfulHomeDelivery:Object.create(null),contentfulHomePreview:Object.create(null),frontiersGraph:Object.create(null)}}}("journal_journal","public_space",1,"frontiersin.org",null,"_self","",true,3,"frontierspartnerships.org","_blank",false,5,0,"Frontiers in Systems Neuroscience","PDF","systems-neuroscience","Link",4,2,"description","Frontiers","Help center","Grey","Medium","ssph-journal.org","fship","image","landscape","Front. Syst. Neurosci.","1662-5137",void 0,"Perception and Reality: Why a Wholly Empirical Paradigm is Needed to Understand Vision",18,9,"citation_author","citation_author_institution",1920,"por-journal.com",7,"escubed.org",1918,"fipp","https:\u002F\u002Fd2csxpduxe849s.cloudfront.net\u002Fmedia\u002FE32629C6-9347-4F84-81FEAEF7BFA342B3\u002FB06310AC-9C74-4D95-A83F6460BFCF85C8\u002Fwebimage-C25AE192-F2F3-4D2F-BA1D6B5A7D5E1994.png","science","neurology","22C10171-81B3-4DA6-99342F272A32E8BB","jpg","2022-06-27T10:00:33Z","fnsys",1091,55,"journal_field","10.3389\u002Ffnsys.2015.00156","\u003Cp\u003EA central puzzle in vision science is how perceptions that are routinely at odds with physical measurements of real world properties can arise from neural responses that nonetheless lead to effective behaviors. Here we argue that the solution depends on: (1) rejecting the assumption that the goal of vision is to recover, however imperfectly, properties of the world; and (2) replacing it with a paradigm in which perceptions reflect biological utility based on past experience rather than objective features of the environment. 
Present evidence is consistent with the conclusion that conceiving vision in wholly empirical terms provides a plausible way to understand what we see and why.\u003C\u002Fp\u003E",3099,37265,"Dale","Duke Institute for Brain Sciences, Duke University","Durham, NC, USA",101928,"Yaniv","Duke-NUS Graduate Medical School","Singapore, Singapore",149132,"William T.",114786,514,"Dirk",3813,"Rava",30948,"Walter","EPUB","fnsys-09-00156.pdf","Frontiers | Perception and Reality: Why a Wholly Empirical Paradigm is Needed to Understand Vision","https:\u002F\u002Fwww.frontiersin.org\u002Fjournals\u002Fsystems-neuroscience\u002Farticles\u002F10.3389\u002Ffnsys.2015.00156\u002Ffull","A central puzzle in vision science is how perceptions that are routinely at odds with physical measurements of real world properties can arise from neural re...","og:title","og:description","keywords","og:site_name","og:image","og:type","og:url","twitter:card","citation_volume","citation_journal_title","citation_publisher","citation_journal_abbrev","citation_issn","citation_doi","citation_firstpage","citation_language","citation_title","citation_keywords","citation_abstract","citation_pdf_url","citation_online_date","citation_publication_date","Duke Institute for Brain Sciences, Duke University, Durham, NC, USA","dc.identifier","articles","editors","research-topics","https:\u002F\u002Fd2csxpduxe849s.cloudfront.net\u002Fmedia\u002FE32629C6-9347-4F84-81FEAEF7BFA342B3\u002F0B4B1380-42EB-4FD5-9D7E2DBC603E79F8\u002Fwebimage-C4875379-1478-416F-B03DF68FE3D8DBB5.png","Man ultramarathon runner in the mountains he trains at sunset","https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fresearch-integrity","How we publish","https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fhow-we-publish","Research Topics","Fee policy","https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Ffee-policy","https:\u002F\u002Fforum.frontiersin.org\u002F","Frontiers Planet Prize","https:\u002F\u002Fwww.frontiersplanetprize.org\u002F","this link will take you to the Frontiers Planet Prize website","Career opportunities","https:\u002F\u002Fcareers.frontiersin.org\u002F","https:\u002F\u002Fwww.frontiersin.org\u002Fabout\u002Fcontact","Author guidelines","Editor guidelines","https:\u002F\u002Fwww.frontiersin.org\u002Fjournals","https:\u002F\u002Fwww.frontiersin.org\u002Farticles","Articles","https:\u002F\u002Fhelpcenter.frontiersin.org","Frontiers for Young Minds","Frontiers Facebook","Transplant International","transplant-international","ti",1921,"Spanish Journal of Soil Science","spanish-journal-of-soil-science","sjss","ebm-journal.org","Public Health Reviews","public-health-reviews","phrs","Pathology and Oncology Research","pathology-and-oncology-research","pore",21,"Pastoralism: Research, Policy and Practice","pastoralism-research-policy-and-practice","past",11,"Oncology Reviews","oncology-reviews","or","Journal of Pharmacy & Pharmaceutical Sciences","journal-of-pharmacy-pharmaceutical-sciences","jpps","Journal of Cutaneous Immunology and Allergy","journal-of-cutaneous-immunology-and-allergy","JCIA","Journal of Abdominal Wall Surgery","journal-of-abdominal-wall-surgery","jaws",1919,"International Journal of Public Health","international-journal-of-public-health","ijph","Frontiers in Pathology","pathology","fpath",13,12,17,6,"Experimental Biology and Medicine","experimental-biology-and-medicine","EBM","European Journal of Cultural Management and Policy","european-journal-of-cultural-management-and-policy","ejcmp","Earth Science, Systems and 
Society","earth-science-systems-and-society","esss","Dystonia","dystonia","dyst","British Journal of Biomedical Science","british-journal-of-biomedical-science","bjbs","Aerospace Research Communications","aerospace-research-communications","arc","Advances in Drug and Alcohol Research","advances-in-drug-and-alcohol-research","adar","Acta Virologica","acta-virologica","av","Acta Biochimica Polonica","acta-biochimica-polonica"));</script><script src="/article-pages/_nuxt/e397d1a.js" defer></script><script src="/article-pages/_nuxt/a5e7651.js" defer></script><script src="/article-pages/_nuxt/f548f7f.js" defer></script><script src="/article-pages/_nuxt/e3c5a8f.js" defer></script><script src="/article-pages/_nuxt/0d6d8e5.js" defer></script><script src="/article-pages/_nuxt/ed7fc59.js" defer></script><script src="/article-pages/_nuxt/2abb6c5.js" defer></script><script src="/article-pages/_nuxt/701e3a3.js" defer></script><script src="/article-pages/_nuxt/71728a1.js" defer></script><script data-n-head="ssr" src="https://cdnjs.cloudflare.com/polyfill/v3/polyfill.min.js?features=es6" data-body="true" async></script><script data-n-head="ssr" src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-MML-AM_CHTML" data-body="true" async></script><script data-n-head="ssr" src="https://d1bxh8uas1mnw7.cloudfront.net/assets/altmetric_badges-f0bc9b243ff5677d05460c1eb71834ca998946d764eb3bc244ab4b18ba50d21e.js" data-body="true" async></script><script data-n-head="ssr" src="https://api.altmetric.com/v1/doi/10.3389/fnsys.2015.00156?callback=_altmetric.embed_callback&domain=www.frontiersin.org&key=3c130976ca2b8f2e88f8377633751ba1&cache_until=14-15" data-body="true" async></script><script data-n-head="ssr" src="https://crossmark-cdn.crossref.org/widget/v2.0/widget.js" data-body="true" async></script> </body> </html>