David Corina - Academia.edu

rel="nofollow" href="https://www.academia.edu/press">Press</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://medium.com/@academia">Blog</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="false" href="https://www.academia.edu/documents">Papers</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://www.academia.edu/terms">Terms</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://www.academia.edu/privacy">Privacy</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://www.academia.edu/copyright">Copyright</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://www.academia.edu/hiring"><i class="fa fa-briefcase"></i>&nbsp;We're Hiring!</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://support.academia.edu/"><i class="fa fa-question-circle"></i>&nbsp;Help Center</a></li><li class="js-mobile-nav-collapse-trigger u-borderColorGrayLight u-borderBottom1 dropup" style="display:none"><a href="#">less&nbsp<span class="caret"></span></a></li></ul></li></ul></div></div></div><script>(function(){ var $moreLink = $(".js-mobile-nav-expand-trigger"); var $lessLink = $(".js-mobile-nav-collapse-trigger"); var $section = $('.js-mobile-nav-expand-section'); $moreLink.click(function(ev){ ev.preventDefault(); $moreLink.hide(); $lessLink.show(); $section.collapse('show'); }); $lessLink.click(function(ev){ ev.preventDefault(); $moreLink.show(); $lessLink.hide(); $section.collapse('hide'); }); })() if ($a.is_logged_in() || false) { new Aedu.NavigationController({ el: '.js-main-nav', showHighlightedNotification: false }); } else { $(".js-header-login-url").attr("href", $a.loginUrlWithRedirect()); } Aedu.autocompleteSearch = new AutocompleteSearch({el: '.js-SiteSearch-form'});</script></div></div> <div id='site' class='fixed'> <div id="content" class="clearfix"> <script>document.addEventListener('DOMContentLoaded', function(){ var $dismissible = $(".dismissible_banner"); $dismissible.click(function(ev) { $dismissible.hide(); }); });</script> <script src="//a.academia-assets.com/assets/webpack_bundles/profile.wjs-bundle-ae3d0ee232cd83d11499343688b0160a3c7db15e95cb2d0844cae78d49ea53f1.js" defer="defer"></script><script>Aedu.rankings = { showPaperRankingsLink: false } $viewedUser = Aedu.User.set_viewed( {"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2","photo":"https://0.academia-photos.com/142673724/44016356/34789376/s65_david.corina.jpg","has_photo":true,"is_analytics_public":false,"interests":[{"id":73860,"name":"Phonotactics","url":"https://www.academia.edu/Documents/in/Phonotactics"},{"id":646584,"name":"Cantonese Linguistics","url":"https://www.academia.edu/Documents/in/Cantonese_Linguistics"},{"id":115190,"name":"Big Five Personality Traits","url":"https://www.academia.edu/Documents/in/Big_Five_Personality_Traits"},{"id":8538,"name":"Working Memory","url":"https://www.academia.edu/Documents/in/Working_Memory"},{"id":8938,"name":"Sociophonetics","url":"https://www.academia.edu/Documents/in/Sociophonetics"}]} ); if ($a.is_logged_in() && $viewedUser.is_current_user()) { $('body').addClass('profile-viewed-by-owner'); } $socialProfiles = []</script><div 
id="js-react-on-rails-context" style="display:none" data-rails-context="{&quot;inMailer&quot;:false,&quot;i18nLocale&quot;:&quot;en&quot;,&quot;i18nDefaultLocale&quot;:&quot;en&quot;,&quot;href&quot;:&quot;https://independent.academia.edu/DavidCorina2&quot;,&quot;location&quot;:&quot;/DavidCorina2&quot;,&quot;scheme&quot;:&quot;https&quot;,&quot;host&quot;:&quot;independent.academia.edu&quot;,&quot;port&quot;:null,&quot;pathname&quot;:&quot;/DavidCorina2&quot;,&quot;search&quot;:null,&quot;httpAcceptLanguage&quot;:null,&quot;serverSide&quot;:false}"></div> <div class="js-react-on-rails-component" style="display:none" data-component-name="ProfileCheckPaperUpdate" data-props="{}" data-trace="false" data-dom-id="ProfileCheckPaperUpdate-react-component-5ca5aef0-ad7d-445d-8440-e655fc077ac2"></div> <div id="ProfileCheckPaperUpdate-react-component-5ca5aef0-ad7d-445d-8440-e655fc077ac2"></div> <div class="DesignSystem"><div class="onsite-ping" id="onsite-ping"></div></div><div class="profile-user-info DesignSystem"><div class="social-profile-container"><div class="left-panel-container"><div class="user-info-component-wrapper"><div class="user-summary-cta-container"><div class="user-summary-container"><div class="social-profile-avatar-container"><img class="profile-avatar u-positionAbsolute" alt="David Corina" border="0" onerror="if (this.src != &#39;//a.academia-assets.com/images/s200_no_pic.png&#39;) this.src = &#39;//a.academia-assets.com/images/s200_no_pic.png&#39;;" width="200" height="200" src="https://0.academia-photos.com/142673724/44016356/34789376/s200_david.corina.jpg" /></div><div class="title-container"><h1 class="ds2-5-heading-sans-serif-sm">David Corina</h1><div class="affiliations-container fake-truncate js-profile-affiliations"></div></div></div><div class="sidebar-cta-container"><button class="ds2-5-button hidden profile-cta-button grow js-profile-follow-button" data-broccoli-component="user-info.follow-button" data-click-track="profile-user-info-follow-button" data-follow-user-fname="David" data-follow-user-id="142673724" data-follow-user-source="profile_button" data-has-google="false"><span class="material-symbols-outlined" style="font-size: 20px" translate="no">add</span>Follow</button><button class="ds2-5-button hidden profile-cta-button grow js-profile-unfollow-button" data-broccoli-component="user-info.unfollow-button" data-click-track="profile-user-info-unfollow-button" data-unfollow-user-id="142673724"><span class="material-symbols-outlined" style="font-size: 20px" translate="no">done</span>Following</button></div></div><div class="user-stats-container"><a><div class="stat-container js-profile-followers"><p class="label">Followers</p><p class="data">9</p></div></a><a><div class="stat-container js-profile-followees" data-broccoli-component="user-info.followees-count" data-click-track="profile-expand-user-info-following"><p class="label">Following</p><p class="data">1</p></div></a><span><div class="stat-container"><p class="label"><span class="js-profile-total-view-text">Public Views</span></p><p class="data"><span class="js-profile-view-count"></span></p></div></span></div><div class="ri-section"><div class="ri-section-header"><span>Interests</span></div><div class="ri-tags-container"><a data-click-track="profile-user-info-expand-research-interests" data-has-card-for-ri-list="142673724" href="https://www.academia.edu/Documents/in/Phonotactics"><div id="js-react-on-rails-context" style="display:none" 
data-rails-context="{&quot;inMailer&quot;:false,&quot;i18nLocale&quot;:&quot;en&quot;,&quot;i18nDefaultLocale&quot;:&quot;en&quot;,&quot;href&quot;:&quot;https://independent.academia.edu/DavidCorina2&quot;,&quot;location&quot;:&quot;/DavidCorina2&quot;,&quot;scheme&quot;:&quot;https&quot;,&quot;host&quot;:&quot;independent.academia.edu&quot;,&quot;port&quot;:null,&quot;pathname&quot;:&quot;/DavidCorina2&quot;,&quot;search&quot;:null,&quot;httpAcceptLanguage&quot;:null,&quot;serverSide&quot;:false}"></div> <div class="js-react-on-rails-component" style="display:none" data-component-name="Pill" data-props="{&quot;color&quot;:&quot;gray&quot;,&quot;children&quot;:[&quot;Phonotactics&quot;]}" data-trace="false" data-dom-id="Pill-react-component-868f3289-4247-44b0-a5b4-6e4ad4de525a"></div> <div id="Pill-react-component-868f3289-4247-44b0-a5b4-6e4ad4de525a"></div> </a><a data-click-track="profile-user-info-expand-research-interests" data-has-card-for-ri-list="142673724" href="https://www.academia.edu/Documents/in/Cantonese_Linguistics"><div class="js-react-on-rails-component" style="display:none" data-component-name="Pill" data-props="{&quot;color&quot;:&quot;gray&quot;,&quot;children&quot;:[&quot;Cantonese Linguistics&quot;]}" data-trace="false" data-dom-id="Pill-react-component-73287165-552c-4652-abf5-efd4852023c8"></div> <div id="Pill-react-component-73287165-552c-4652-abf5-efd4852023c8"></div> </a><a data-click-track="profile-user-info-expand-research-interests" data-has-card-for-ri-list="142673724" href="https://www.academia.edu/Documents/in/Big_Five_Personality_Traits"><div class="js-react-on-rails-component" style="display:none" data-component-name="Pill" data-props="{&quot;color&quot;:&quot;gray&quot;,&quot;children&quot;:[&quot;Big Five Personality Traits&quot;]}" data-trace="false" data-dom-id="Pill-react-component-1522351c-7c04-4a7b-bb9d-9563029a1843"></div> <div id="Pill-react-component-1522351c-7c04-4a7b-bb9d-9563029a1843"></div> </a><a data-click-track="profile-user-info-expand-research-interests" data-has-card-for-ri-list="142673724" href="https://www.academia.edu/Documents/in/Working_Memory"><div class="js-react-on-rails-component" style="display:none" data-component-name="Pill" data-props="{&quot;color&quot;:&quot;gray&quot;,&quot;children&quot;:[&quot;Working Memory&quot;]}" data-trace="false" data-dom-id="Pill-react-component-7f86e3d0-981f-4f76-9bbd-505d9a9f1709"></div> <div id="Pill-react-component-7f86e3d0-981f-4f76-9bbd-505d9a9f1709"></div> </a><a data-click-track="profile-user-info-expand-research-interests" data-has-card-for-ri-list="142673724" href="https://www.academia.edu/Documents/in/Sociophonetics"><div class="js-react-on-rails-component" style="display:none" data-component-name="Pill" data-props="{&quot;color&quot;:&quot;gray&quot;,&quot;children&quot;:[&quot;Sociophonetics&quot;]}" data-trace="false" data-dom-id="Pill-react-component-7049f8f5-9319-471e-90aa-aa02db53d4e8"></div> <div id="Pill-react-component-7049f8f5-9319-471e-90aa-aa02db53d4e8"></div> </a></div></div></div></div><div class="right-panel-container"><div class="user-content-wrapper"><div class="uploads-container" id="social-redesign-work-container"><div class="upload-header"><h2 class="ds2-5-heading-sans-serif-xs">Uploads</h2></div><div class="documents-container backbone-social-profile-documents" style="width: 100%;"><div class="u-taCenter"></div><div class="profile--tab_content_container js-tab-pane tab-pane active" id="all"><div class="profile--tab_heading_container js-section-heading" 
data-section="Papers" id="Papers"><h3 class="profile--tab_heading_container">Papers by David Corina</h3></div><div class="js-work-strip profile--work_container" data-work-id="119246721"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/119246721/Human_temporal_cortical_single_neuron_activity_during_working_memory_maintenance"><img alt="Research paper thumbnail of Human temporal cortical single neuron activity during working memory maintenance" class="work-thumbnail" src="https://attachments.academia-assets.com/114662216/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/119246721/Human_temporal_cortical_single_neuron_activity_during_working_memory_maintenance">Human temporal cortical single neuron activity during working memory maintenance</a></div><div class="wp-workCard_item"><span>Neuropsychologia</span><span>, Jun 1, 2016</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="024296379f3e988c5ba5835422d82d58" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:114662216,&quot;asset_id&quot;:119246721,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/114662216/download_file?st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="119246721"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="119246721"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 119246721; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=119246721]").text(description); $(".js-view-count[data-work-id=119246721]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 119246721; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='119246721']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 119246721, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 
})(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "024296379f3e988c5ba5835422d82d58" } } $('.js-work-strip[data-work-id=119246721]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":119246721,"title":"Human temporal cortical single neuron activity during working memory maintenance","translated_title":"","metadata":{"publisher":"Elsevier BV","grobid_abstract":"The Working Memory model of human memory, first introduced by Baddeley and Hitch (1974) , has been one of the most influential psychological constructs in cognitive psychology and human neuroscience. However the neuronal correlates of core components of this model have yet to be fully elucidated. Here we present data from two studies where human temporal cortical single neuron activity was recorded during tasks differentially affecting the maintenance component of verbal working memory. In Study One we vary the presence or absence of distracting items for the entire period of memory storage. In Study Two we vary the duration of storage so that distractors filled all, or only one-third of the time the memory was stored. Extracellular single neuron recordings were obtained from 36 subjects undergoing awake temporal lobe resections for epilepsy, 25 in Study one, 11 in Study two. Recordings were obtained from a total of 166 lateral temporal cortex neurons during performance of one of these two tasks, 86 study one, 80 study two. Significant changes in activity with distractor manipulation were present in 74 of these neurons (45%), 38 Study one, 36 Study two. In 48 (65%) of those there was increased activity during the period when distracting items were absent, 26 Study One, 22 Study Two. The magnitude of this increase was greater for Study One, 47.6%, than Study Two, 8.1%, paralleling the reduction in memory errors in the absence of distracters, for Study One of 70.3%, Study Two 26.3% These findings establish that human lateral temporal cortex is part of the neural system for working memory, with activity during maintenance of that memory that parallels performance, suggesting it represents active rehearsal. In 31 of these neurons (65%) this activity was an extension of that during working memory encoding that differed significantly from the neural processes recorded during overt and silent language tasks without a recent memory component, 17 Study one, 14 Study two. 
Contrary to the Baddeley model, that activity during verbal working memory maintenance often represented activity specific to working memory rather than speech or language.","publication_date":{"day":1,"month":6,"year":2016,"errors":{}},"publication_name":"Neuropsychologia","grobid_abstract_attachment_id":114662216},"translated_abstract":null,"internal_url":"https://www.academia.edu/119246721/Human_temporal_cortical_single_neuron_activity_during_working_memory_maintenance","translated_internal_url":"","created_at":"2024-05-17T09:37:19.254-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":114662216,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662216/thumbnails/1.jpg","file_name":"pmc4899132.pdf","download_url":"https://www.academia.edu/attachments/114662216/download_file?st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_temporal_cortical_single_neuron_ac.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662216/pmc4899132-libre.pdf?1715964031=\u0026response-content-disposition=attachment%3B+filename%3DHuman_temporal_cortical_single_neuron_ac.pdf\u0026Expires=1733053720\u0026Signature=cn1fQftdcPvbtwie1TAFvrkTLV4IUpck3eqSw0n~MkiLYfmfbF0KtJpct5QNkpnQ7mNrf8dQbKPKQd7kuj6bB4ekb82NTHqbThNLsE978XIK1FgmbfHMlybHbSp1X6UmadcLV4nsGHjuZ~6Au1Mp8gEl-NA4ywQUShuAL0rzQ-Wgaow16iMOEvgBhhhtgKMYQpHP5opxOLeKXSVFjMsCaBacm1wAWCEtOeEWm8Xas~NgJa2fOL0v1ZFtnVq5rMw7b8OZtLdX41O5HdSSRyDBh~T1Fros1VNriwyZ23IsALiuouC7N0E9N75cH2NiyI5fInDyYcHnIuA4fXa-7dYvYQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Human_temporal_cortical_single_neuron_activity_during_working_memory_maintenance","translated_slug":"","page_count":29,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David 
Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":114662216,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662216/thumbnails/1.jpg","file_name":"pmc4899132.pdf","download_url":"https://www.academia.edu/attachments/114662216/download_file?st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_temporal_cortical_single_neuron_ac.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662216/pmc4899132-libre.pdf?1715964031=\u0026response-content-disposition=attachment%3B+filename%3DHuman_temporal_cortical_single_neuron_ac.pdf\u0026Expires=1733053720\u0026Signature=cn1fQftdcPvbtwie1TAFvrkTLV4IUpck3eqSw0n~MkiLYfmfbF0KtJpct5QNkpnQ7mNrf8dQbKPKQd7kuj6bB4ekb82NTHqbThNLsE978XIK1FgmbfHMlybHbSp1X6UmadcLV4nsGHjuZ~6Au1Mp8gEl-NA4ywQUShuAL0rzQ-Wgaow16iMOEvgBhhhtgKMYQpHP5opxOLeKXSVFjMsCaBacm1wAWCEtOeEWm8Xas~NgJa2fOL0v1ZFtnVq5rMw7b8OZtLdX41O5HdSSRyDBh~T1Fros1VNriwyZ23IsALiuouC7N0E9N75cH2NiyI5fInDyYcHnIuA4fXa-7dYvYQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"},{"id":114662215,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662215/thumbnails/1.jpg","file_name":"pmc4899132.pdf","download_url":"https://www.academia.edu/attachments/114662215/download_file","bulk_download_file_name":"Human_temporal_cortical_single_neuron_ac.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662215/pmc4899132-libre.pdf?1715964031=\u0026response-content-disposition=attachment%3B+filename%3DHuman_temporal_cortical_single_neuron_ac.pdf\u0026Expires=1733053720\u0026Signature=T2WHVX1bhbE4-PDBuZfQ-lcQ8YEgmU-0yaxG9k7l~p9bZuXNBH-nAPKtxHFjEGxk54agIFHooPtzyCzaivaXVASDe87M7T7e1~6xU47lKKjeFnjMScD0JwbiuwgJbvRoZ~nB~2pFGGdZLd61d-VEZYtAgBDxcHNcf5Ln~qzTSw6CP7cNIU4ok98ncOQ~kR8Ngv~~nDG8x71CUm6UxkOfCDOQyqeZYAbcxQrA0xph-1Dt2NlmiNOIzA4eoz4Pw4CEXx0xc60cH-d~OM2hZk6AFc2GDgk0qLjdqK6NHpclhQweoF8ycSqackeDMjyvWjnovFK0wyxzCla9h3YfJI65Yw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":161,"name":"Neuroscience","url":"https://www.academia.edu/Documents/in/Neuroscience"},{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":237,"name":"Cognitive Science","url":"https://www.academia.edu/Documents/in/Cognitive_Science"},{"id":8538,"name":"Working Memory","url":"https://www.academia.edu/Documents/in/Working_Memory"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":57557,"name":"Temporal Lobe","url":"https://www.academia.edu/Documents/in/Temporal_Lobe"},{"id":452621,"name":"Neuropsychologia","url":"https://www.academia.edu/Documents/in/Neuropsychologia"},{"id":1239755,"name":"Neurosciences","url":"https://www.academia.edu/Documents/in/Neurosciences"}],"urls":[{"id":42052901,"url":"https://europepmc.org/articles/pmc4899132?pdf=render"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="119246720"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/119246720/Evaluating_spatial_normalization_methods_for_the_human_brain"><img alt="Research paper thumbnail of Evaluating spatial normalization methods for the human brain" class="work-thumbnail" src="https://attachments.academia-assets.com/114662260/thumbnails/1.jpg" 
Evaluating spatial normalization methods for the human brain
Conference Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2005

Cortical mapping (CSM) studies have shown that cortical locations for language function are highly variable from one subject to the next. If individual variation can be normalized, patterns of language organization may emerge that were heretofore hidden. In order to uncover these patterns, computer-aided spatial normalization to a common atlas is required. Our goal was to determine a methodology by which spatial normalization methods could be evaluated and compared. We developed key metrics to measure the accuracy of a surface-based (Caret) and a volume-based (SPM2) method. We specified that the optimal method would i) minimize variation, as measured by spread reduction between CSM language sites across subjects, while also ii) preserving anatomical localization of all CSM sites. Eleven subjects' structural MR image sets and corresponding CSM site coordinates were registered to the colin27 human brain atlas using each method. Local analysis showed that mapping error rates were highest in mor...

Topics: Computer Science, Artificial Intelligence, Medicine, Image Registration, Anatomy, Statistical Significance, Feedback, Error Analysis, Bit Error Rate, Conference Proceedings, Individual Variation, Human Brain, MR Imaging
Significance","url":"https://www.academia.edu/Documents/in/Statistical_Significance"},{"id":137633,"name":"Feedback","url":"https://www.academia.edu/Documents/in/Feedback"},{"id":152918,"name":"Error Analysis","url":"https://www.academia.edu/Documents/in/Error_Analysis"},{"id":164637,"name":"Bit Error Rate","url":"https://www.academia.edu/Documents/in/Bit_Error_Rate"},{"id":179931,"name":"Conference Proceedings","url":"https://www.academia.edu/Documents/in/Conference_Proceedings"},{"id":198377,"name":"Individual variation","url":"https://www.academia.edu/Documents/in/Individual_variation"},{"id":203010,"name":"Human Brain","url":"https://www.academia.edu/Documents/in/Human_Brain"},{"id":1122411,"name":"Mr Imaging","url":"https://www.academia.edu/Documents/in/Mr_Imaging"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="119246713"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/119246713/A_novel_EEG_paradigm_to_simultaneously_and_rapidly_assess_the_functioning_of_auditory_and_visual_pathways"><img alt="Research paper thumbnail of A novel EEG paradigm to simultaneously and rapidly assess the functioning of auditory and visual pathways" class="work-thumbnail" src="https://attachments.academia-assets.com/114662242/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/119246713/A_novel_EEG_paradigm_to_simultaneously_and_rapidly_assess_the_functioning_of_auditory_and_visual_pathways">A novel EEG paradigm to simultaneously and rapidly assess the functioning of auditory and visual pathways</a></div><div class="wp-workCard_item"><span>Journal of Neurophysiology</span><span>, 2019</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Objective assessment of the sensory pathways is crucial for understanding their development acros...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Objective assessment of the sensory pathways is crucial for understanding their development across the life span and how they may be affected by neurodevelopmental disorders (e.g., autism spectrum) and neurological pathologies (e.g., stroke, multiple sclerosis, etc.). Quick and passive measurements, for example, using electroencephalography (EEG), are especially important when working with infants and young children and with patient populations having communication deficits (e.g., aphasia). However, many EEG paradigms are limited to measuring activity from one sensory domain at a time, may be time consuming, and target only a subset of possible responses from that particular sensory domain (e.g., only auditory brainstem responses or only auditory P1-N1-P2 evoked potentials). 
Thus we developed a new multisensory paradigm that enables simultaneous, robust, and rapid (6–12 min) measurements of both auditory and visual EEG activity, including auditory brainstem responses, auditory and v...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="89f3ad46808fae12abd7e500df343f77" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:114662242,&quot;asset_id&quot;:119246713,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/114662242/download_file?st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="119246713"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="119246713"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 119246713; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=119246713]").text(description); $(".js-view-count[data-work-id=119246713]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 119246713; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='119246713']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 119246713, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "89f3ad46808fae12abd7e500df343f77" } } $('.js-work-strip[data-work-id=119246713]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":119246713,"title":"A novel EEG paradigm to simultaneously and rapidly assess the functioning of auditory and visual pathways","translated_title":"","metadata":{"abstract":"Objective assessment of the sensory pathways is crucial for understanding their development across the life span and how they may be affected by 
neurodevelopmental disorders (e.g., autism spectrum) and neurological pathologies (e.g., stroke, multiple sclerosis, etc.). Quick and passive measurements, for example, using electroencephalography (EEG), are especially important when working with infants and young children and with patient populations having communication deficits (e.g., aphasia). However, many EEG paradigms are limited to measuring activity from one sensory domain at a time, may be time consuming, and target only a subset of possible responses from that particular sensory domain (e.g., only auditory brainstem responses or only auditory P1-N1-P2 evoked potentials). Thus we developed a new multisensory paradigm that enables simultaneous, robust, and rapid (6–12 min) measurements of both auditory and visual EEG activity, including auditory brainstem responses, auditory and v...","publisher":"American Physiological Society","publication_date":{"day":null,"month":null,"year":2019,"errors":{}},"publication_name":"Journal of Neurophysiology"},"translated_abstract":"Objective assessment of the sensory pathways is crucial for understanding their development across the life span and how they may be affected by neurodevelopmental disorders (e.g., autism spectrum) and neurological pathologies (e.g., stroke, multiple sclerosis, etc.). Quick and passive measurements, for example, using electroencephalography (EEG), are especially important when working with infants and young children and with patient populations having communication deficits (e.g., aphasia). However, many EEG paradigms are limited to measuring activity from one sensory domain at a time, may be time consuming, and target only a subset of possible responses from that particular sensory domain (e.g., only auditory brainstem responses or only auditory P1-N1-P2 evoked potentials). 
Thus we developed a new multisensory paradigm that enables simultaneous, robust, and rapid (6–12 min) measurements of both auditory and visual EEG activity, including auditory brainstem responses, auditory and v...","internal_url":"https://www.academia.edu/119246713/A_novel_EEG_paradigm_to_simultaneously_and_rapidly_assess_the_functioning_of_auditory_and_visual_pathways","translated_internal_url":"","created_at":"2024-05-17T09:36:59.676-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":114662242,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662242/thumbnails/1.jpg","file_name":"231868537.pdf","download_url":"https://www.academia.edu/attachments/114662242/download_file?st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"A_novel_EEG_paradigm_to_simultaneously_a.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662242/231868537-libre.pdf?1715964183=\u0026response-content-disposition=attachment%3B+filename%3DA_novel_EEG_paradigm_to_simultaneously_a.pdf\u0026Expires=1733053720\u0026Signature=VnMzHOnQSI92D9fS5V3p4BGusvw5BDyd0OR5L~P0JU26AO-EI0nSq8-GIU3iKZtSg4r4skNvOUPVRgrNScH8Zx7IoC9wBrhgzhGXdbOGu1nGIqoek00DSIFvEjQMV6Os~na7I8-uWvS1S0jHkC94t9x04Vhgqqx8zfL9oYZeVTLaF8sZwbp2L5X4HjMd5vpxKrO1Kg-hPftflqbNe8Y4izHLoKcWISzf-X01jfTm~MFVKYAzhzcytd3~Sc8zrt1CfNyh6Po4APEh93KjSN-WMrFUlvgM7yCm8iXQZcqNjSiuwEzg9ToTTMZ7WJiXLvzswu8U3lqhkZYxg98iAkL2qw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"A_novel_EEG_paradigm_to_simultaneously_and_rapidly_assess_the_functioning_of_auditory_and_visual_pathways","translated_slug":"","page_count":60,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":114662242,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662242/thumbnails/1.jpg","file_name":"231868537.pdf","download_url":"https://www.academia.edu/attachments/114662242/download_file?st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"A_novel_EEG_paradigm_to_simultaneously_a.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662242/231868537-libre.pdf?1715964183=\u0026response-content-disposition=attachment%3B+filename%3DA_novel_EEG_paradigm_to_simultaneously_a.pdf\u0026Expires=1733053720\u0026Signature=VnMzHOnQSI92D9fS5V3p4BGusvw5BDyd0OR5L~P0JU26AO-EI0nSq8-GIU3iKZtSg4r4skNvOUPVRgrNScH8Zx7IoC9wBrhgzhGXdbOGu1nGIqoek00DSIFvEjQMV6Os~na7I8-uWvS1S0jHkC94t9x04Vhgqqx8zfL9oYZeVTLaF8sZwbp2L5X4HjMd5vpxKrO1Kg-hPftflqbNe8Y4izHLoKcWISzf-X01jfTm~MFVKYAzhzcytd3~Sc8zrt1CfNyh6Po4APEh93KjSN-WMrFUlvgM7yCm8iXQZcqNjSiuwEzg9ToTTMZ7WJiXLvzswu8U3lqhkZYxg98iAkL2qw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":161,"name":"Neuroscience","url":"https://www.academia.edu/Documents/in/Neuroscience"},{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":422,"name":"Computer 
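The paradigm above extracts several evoked responses from one continuous EEG recording. As a rough, hypothetical illustration of the basic signal processing involved (epoching around event markers and averaging, not the authors' actual pipeline), the Python/NumPy sketch below averages baseline-corrected epochs into an evoked response; the sampling rate, epoch window, and all variable names are assumptions.

import numpy as np

def evoked_average(eeg, events, fs, tmin=-0.1, tmax=0.4):
    # eeg    : array (n_channels, n_samples), continuous recording
    # events : sample indices of stimulus onsets
    # fs     : sampling rate in Hz; tmin/tmax: epoch window in seconds
    start = int(round(tmin * fs))
    stop = int(round(tmax * fs))
    epochs = []
    for onset in events:
        if onset + start < 0 or onset + stop > eeg.shape[1]:
            continue  # skip epochs that fall outside the recording
        epoch = eeg[:, onset + start:onset + stop].astype(float)
        if start < 0:
            # baseline-correct each channel using the pre-stimulus interval
            epoch = epoch - epoch[:, :-start].mean(axis=1, keepdims=True)
        epochs.append(epoch)
    return np.mean(epochs, axis=0)  # evoked response, (n_channels, n_epoch_samples)

# Assumed usage: auditory and visual events presented simultaneously are
# averaged separately from the same continuous recording.
# auditory_ep = evoked_average(eeg, auditory_onsets, fs=1000)
# visual_ep   = evoked_average(eeg, visual_onsets,   fs=1000)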
Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":10904,"name":"Electroencephalography","url":"https://www.academia.edu/Documents/in/Electroencephalography"},{"id":22272,"name":"Neurophysiology","url":"https://www.academia.edu/Documents/in/Neurophysiology"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":2922956,"name":"Psychology and Cognitive Sciences","url":"https://www.academia.edu/Documents/in/Psychology_and_Cognitive_Sciences"},{"id":3763225,"name":"Medical and Health Sciences","url":"https://www.academia.edu/Documents/in/Medical_and_Health_Sciences"}],"urls":[{"id":42052897,"url":"https://www.physiology.org/doi/pdf/10.1152/jn.00868.2018"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554085"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554085/in_American_Sign_Language"><img alt="Research paper thumbnail of in American Sign Language?" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554085/in_American_Sign_Language">in American Sign Language?</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Perceptual invariance or orientation specificity</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554085"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554085"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554085; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554085]").text(description); $(".js-view-count[data-work-id=77554085]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554085; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554085']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554085, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> 
Characterization Of Visual Properties Of Spatial Frequency And Speed in (2003).
Abstract: Careful measurements of the dynamics of speech production have provided important insights into phonetic properties of spoken languages. By contrast, analytic quantification of the visual properties of signed languages remains largely unexplored. The purpose of this study was to characterize the spatial and temporal visual properties of American Sign Language (ASL). Novel measurement techniques were used to analyze the spatial frequency of signs and the speed of the hands as they move through space. In Study 1, the amount of energy (or "contrast") as a function of spatial frequency was determined for various sign categories by applying a Fourier transform to static photographs of two ASL signers. In order to determine whether signing produces unique spatial frequency information, amplitude spectra of a person signing were compared to those of a "neutral" image of a person at rest (not signing). The results of this study reveal only small differences in the amplitu...
Link: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.5.4932&rep=rep1&type=pdf
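Study 1's measure of energy ("contrast") as a function of spatial frequency can be illustrated with a generic radially averaged FFT amplitude spectrum. The Python sketch below is only a plausible reconstruction of that kind of analysis, not the authors' code; the function name, binning scheme, and frequency units are assumptions.

import numpy as np

def radial_amplitude_spectrum(image, n_bins=64):
    # image: 2-D grayscale array. Returns (freqs, amplitude), where freqs are
    # radial spatial frequencies in cycles per image and amplitude is the mean
    # FFT magnitude within each radial frequency bin.
    h, w = image.shape
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean())))
    fy = np.fft.fftshift(np.fft.fftfreq(h)) * h   # cycles/image, vertical
    fx = np.fft.fftshift(np.fft.fftfreq(w)) * w   # cycles/image, horizontal
    radius = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    edges = np.linspace(0.0, radius.max(), n_bins + 1)
    which = np.digitize(radius.ravel(), edges) - 1
    flat = spectrum.ravel()
    amplitude = np.array([flat[which == b].mean() if np.any(which == b) else 0.0
                          for b in range(n_bins)])
    freqs = 0.5 * (edges[:-1] + edges[1:])
    return freqs, amplitude

# Assumed comparison in the spirit of Study 1: a signing frame versus a
# "neutral" resting frame of the same person.
# f, amp_signing = radial_amplitude_spectrum(signing_frame)
# _, amp_neutral = radial_amplitude_spectrum(neutral_frame)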
Frequency","url":"https://www.academia.edu/Documents/in/Spatial_Frequency"},{"id":267802,"name":"Dimensional","url":"https://www.academia.edu/Documents/in/Dimensional"},{"id":390056,"name":"Fourier transform","url":"https://www.academia.edu/Documents/in/Fourier_transform"},{"id":3007616,"name":"Measurement technique","url":"https://www.academia.edu/Documents/in/Measurement_technique"}],"urls":[{"id":19887686,"url":"http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.5.4932\u0026rep=rep1\u0026type=pdf"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554083"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554083/Full_title_Perceptual_invariance_or_orientation_specificity_in_American_Sign_Language_Evidence_from_repetition_priming_for_signs_and_gestures_Short_title_ASL_repetition_priming"><img alt="Research paper thumbnail of Full title : Perceptual invariance or orientation specificity in American Sign Language ? Evidence from repetition priming for signs and gestures Short title : ASL repetition priming" class="work-thumbnail" src="https://attachments.academia-assets.com/84847728/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554083/Full_title_Perceptual_invariance_or_orientation_specificity_in_American_Sign_Language_Evidence_from_repetition_priming_for_signs_and_gestures_Short_title_ASL_repetition_priming">Full title : Perceptual invariance or orientation specificity in American Sign Language ? Evidence from repetition priming for signs and gestures Short title : ASL repetition priming</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Repetition priming has been successfully employed to examine stages of processing in a wide varie...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Repetition priming has been successfully employed to examine stages of processing in a wide variety of cognitive domains including language, object recognition and memory. This study uses a novel repetition priming paradigm in the context of a categorization task to explore early stages in the processing of American Sign Language signs and self-grooming gestures. Specifically, we investigated the degree to which deaf signers&amp;#39; and hearing non-signers&amp;#39; perception of these linguistic or non-linguistic actions might be differentially robust to changes in perceptual viewpoint. We conjectured that to the extent that signers were accessing language-specific representations in their performance of the task, they might show more similar priming effects under different viewing conditions than hearing subjects. In essence, this would provide evidence for a visually-based &amp;quot; lack of invariance &amp;quot; phenomenon. 
However, if the early stages of visual action processing are similar fo...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="87f339653d0cb53bb97eedd3097a2642" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84847728,&quot;asset_id&quot;:77554083,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84847728/download_file?st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554083"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554083"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554083; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554083]").text(description); $(".js-view-count[data-work-id=77554083]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554083; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554083']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554083, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "87f339653d0cb53bb97eedd3097a2642" } } $('.js-work-strip[data-work-id=77554083]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554083,"title":"Full title : Perceptual invariance or orientation specificity in American Sign Language ? Evidence from repetition priming for signs and gestures Short title : ASL repetition priming","translated_title":"","metadata":{"abstract":"Repetition priming has been successfully employed to examine stages of processing in a wide variety of cognitive domains including language, object recognition and memory. 
This study uses a novel repetition priming paradigm in the context of a categorization task to explore early stages in the processing of American Sign Language signs and self-grooming gestures. Specifically, we investigated the degree to which deaf signers\u0026#39; and hearing non-signers\u0026#39; perception of these linguistic or non-linguistic actions might be differentially robust to changes in perceptual viewpoint. We conjectured that to the extent that signers were accessing language-specific representations in their performance of the task, they might show more similar priming effects under different viewing conditions than hearing subjects. In essence, this would provide evidence for a visually-based \u0026quot; lack of invariance \u0026quot; phenomenon. However, if the early stages of visual action processing are similar fo...","ai_title_tag":"Perceptual Invariance in ASL: Evidence from Repetition Priming","publication_date":{"day":null,"month":null,"year":2010,"errors":{}}},"translated_abstract":"Repetition priming has been successfully employed to examine stages of processing in a wide variety of cognitive domains including language, object recognition and memory. This study uses a novel repetition priming paradigm in the context of a categorization task to explore early stages in the processing of American Sign Language signs and self-grooming gestures. Specifically, we investigated the degree to which deaf signers\u0026#39; and hearing non-signers\u0026#39; perception of these linguistic or non-linguistic actions might be differentially robust to changes in perceptual viewpoint. We conjectured that to the extent that signers were accessing language-specific representations in their performance of the task, they might show more similar priming effects under different viewing conditions than hearing subjects. In essence, this would provide evidence for a visually-based \u0026quot; lack of invariance \u0026quot; phenomenon. 
However, if the early stages of visual action processing are similar fo...","internal_url":"https://www.academia.edu/77554083/Full_title_Perceptual_invariance_or_orientation_specificity_in_American_Sign_Language_Evidence_from_repetition_priming_for_signs_and_gestures_Short_title_ASL_repetition_priming","translated_internal_url":"","created_at":"2022-04-25T04:29:43.165-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84847728,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84847728/thumbnails/1.jpg","file_name":"Corina_20FLR_20Paper_20August_202010.pdf","download_url":"https://www.academia.edu/attachments/84847728/download_file?st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Full_title_Perceptual_invariance_or_orie.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84847728/Corina_20FLR_20Paper_20August_202010-libre.pdf?1650886665=\u0026response-content-disposition=attachment%3B+filename%3DFull_title_Perceptual_invariance_or_orie.pdf\u0026Expires=1733053720\u0026Signature=TKIiFIP5tLDxTF8U-qHGwWnhQ5kg3kFts~DbB7G8aX0NZX6HyMYNcbUJ-rVDfFAI28C9n95PnKuTvVMElGxZoRh8iGNyuYapTrYQokiXAAITw-TZGEpaaF0lhcBf962aKfdSs~zUxtfO03RJn~mXc4b8iv6bG55yA~oejXe1bSANYx4M6lHXh8J~H~mCbLy-w4f-Rt2J1v2LPs11nZdxp57Zr29BhLEXBcQiuGz9qXTZun6O~JwkfLJhP3yPlNuM00dUXOrZGmJWKgT~Mphr7Sve9pGUKjqEi0UKDQOx7RSbuNS-Hbgn84OyLkFMJQOPHrS18C6kr9PMWFHbRzsppg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Full_title_Perceptual_invariance_or_orientation_specificity_in_American_Sign_Language_Evidence_from_repetition_priming_for_signs_and_gestures_Short_title_ASL_repetition_priming","translated_slug":"","page_count":36,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David 
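A common way to quantify the priming effects compared here is the reaction-time benefit for repeated versus unrepeated items within each group and viewing condition. The Python/pandas sketch below shows that generic computation; it is not the authors' analysis, and the column names are assumptions.

import pandas as pd

def priming_effects(trials: pd.DataFrame) -> pd.DataFrame:
    # Expected columns (assumed names): 'group' (deaf signer / hearing non-signer),
    # 'viewpoint' (viewing condition), 'primed' (bool), 'rt' (reaction time, ms).
    # Priming effect = mean unprimed RT minus mean primed RT, per cell.
    means = trials.groupby(["group", "viewpoint", "primed"])["rt"].mean().unstack("primed")
    effects = (means[False] - means[True]).rename("priming_ms")
    return effects.reset_index()

# Assumed usage:
# effects = priming_effects(trials)
# print(effects)   # one priming-effect estimate per group x viewpoint cell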
Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84847728,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84847728/thumbnails/1.jpg","file_name":"Corina_20FLR_20Paper_20August_202010.pdf","download_url":"https://www.academia.edu/attachments/84847728/download_file?st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Full_title_Perceptual_invariance_or_orie.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84847728/Corina_20FLR_20Paper_20August_202010-libre.pdf?1650886665=\u0026response-content-disposition=attachment%3B+filename%3DFull_title_Perceptual_invariance_or_orie.pdf\u0026Expires=1733053720\u0026Signature=TKIiFIP5tLDxTF8U-qHGwWnhQ5kg3kFts~DbB7G8aX0NZX6HyMYNcbUJ-rVDfFAI28C9n95PnKuTvVMElGxZoRh8iGNyuYapTrYQokiXAAITw-TZGEpaaF0lhcBf962aKfdSs~zUxtfO03RJn~mXc4b8iv6bG55yA~oejXe1bSANYx4M6lHXh8J~H~mCbLy-w4f-Rt2J1v2LPs11nZdxp57Zr29BhLEXBcQiuGz9qXTZun6O~JwkfLJhP3yPlNuM00dUXOrZGmJWKgT~Mphr7Sve9pGUKjqEi0UKDQOx7RSbuNS-Hbgn84OyLkFMJQOPHrS18C6kr9PMWFHbRzsppg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"},{"id":84847729,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84847729/thumbnails/1.jpg","file_name":"Corina_20FLR_20Paper_20August_202010.pdf","download_url":"https://www.academia.edu/attachments/84847729/download_file","bulk_download_file_name":"Full_title_Perceptual_invariance_or_orie.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84847729/Corina_20FLR_20Paper_20August_202010-libre.pdf?1650886665=\u0026response-content-disposition=attachment%3B+filename%3DFull_title_Perceptual_invariance_or_orie.pdf\u0026Expires=1733053720\u0026Signature=CsRC1GJyneuC6EnLw6DkIxQUSrYbKefVEpGcdVZ4-viXE8eMmzl2bKrxHWJpUgU-cbI17tAJxOfr0c1yUZnW7efOKWk0CRYQ-wPRQTDL10fqSLET9TV8rvJyRV4jCC-DkeXOWB7tfgInEJr6IOYD8kZPReKy5Ii0Htf9tLeK1Oy0I1wlrMJXYZneeDjKWy4RSBtu7eyXl9e9dXbxvpdhaeM-dLvhffkEQyYOiLXVQ2IicUQwJxHChFAau5NtY1OJwjRmn5Xh0clzEgDe9QqAbe7YtfjfcRuXzEmieXvfvP9tSfPs3ofd5X4auUxoPNClQOnHmev3aBm0Ij80tlAftg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[],"urls":[{"id":19887685,"url":"http://lcn.salk.edu/publications/2009%202010/Corina%20FLR%20Paper%20August%202010.pdf"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554082"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554082/Differential_Processing_of_Topographic_and_Referential_Functions_of_Space"><img alt="Research paper thumbnail of Differential Processing of Topographic and Referential Functions of Space" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554082/Differential_Processing_of_Topographic_and_Referential_Functions_of_Space">Differential Processing of Topographic and Referential Functions of Space</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">... 
involving 37 deaf adult signers and 1 hearing adult signer with a predominantly mesial superi...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">... involving 37 deaf adult signers and 1 hearing adult signer with a predominantly mesial superior occipital-parietal lesion] that investigate: (1) the neural underpinnings of topographic and referential spatial functions, (2) on-line processing of these different uses of space, and (3 ...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554082"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554082"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554082; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554082]").text(description); $(".js-view-count[data-work-id=77554082]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554082; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554082']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554082, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=77554082]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554082,"title":"Differential Processing of Topographic and Referential Functions of Space","translated_title":"","metadata":{"abstract":"... 
involving 37 deaf adult signers and 1 hearing adult signer with a predominantly mesial superior occipital-parietal lesion] that investigate: (1) the neural underpinnings of topographic and referential spatial functions, (2) on-line processing of these different uses of space, and (3 ...","publication_date":{"day":null,"month":null,"year":2013,"errors":{}}},"translated_abstract":"... involving 37 deaf adult signers and 1 hearing adult signer with a predominantly mesial superior occipital-parietal lesion] that investigate: (1) the neural underpinnings of topographic and referential spatial functions, (2) on-line processing of these different uses of space, and (3 ...","internal_url":"https://www.academia.edu/77554082/Differential_Processing_of_Topographic_and_Referential_Functions_of_Space","translated_internal_url":"","created_at":"2022-04-25T04:29:43.059-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Differential_Processing_of_Topographic_and_Referential_Functions_of_Space","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[],"research_interests":[{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554081"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554081/Procesamiento_neural_de_un_lenguaje_silbado"><img alt="Research paper thumbnail of Procesamiento neural de un lenguaje silbado" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554081/Procesamiento_neural_de_un_lenguaje_silbado">Procesamiento neural de un lenguaje silbado</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554081"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554081"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554081; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); 
$(".js-view-count[data-work-id=77554081]").text(description); $(".js-view-count[data-work-id=77554081]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554081; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554081']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554081, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=77554081]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554081,"title":"Procesamiento neural de un lenguaje silbado","translated_title":"","metadata":{"publication_date":{"day":null,"month":null,"year":2005,"errors":{}}},"translated_abstract":null,"internal_url":"https://www.academia.edu/77554081/Procesamiento_neural_de_un_lenguaje_silbado","translated_internal_url":"","created_at":"2022-04-25T04:29:42.968-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Procesamiento_neural_de_un_lenguaje_silbado","translated_slug":"","page_count":null,"language":"es","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2","email":"akk2STJxV015dnVhOGxvdmZydzY2d1NhSE5MRis4NlpualJjTmZlNEQyOD0tLXAzVnV1eFl0VkxaVVZDRk0vQnJCWlE9PQ==--391f32be9bca30489f63a7cdd92ac71a49be911d"},"attachments":[],"research_interests":[],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554080"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554080/Visual_Attention_to_the_Periphery_Is_Enhanced_in_Congenitally_Deaf_Individuals"><img alt="Research paper thumbnail of Visual Attention to the Periphery Is Enhanced in Congenitally Deaf Individuals" class="work-thumbnail" src="https://attachments.academia-assets.com/84887760/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a 
class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554080/Visual_Attention_to_the_Periphery_Is_Enhanced_in_Congenitally_Deaf_Individuals">Visual Attention to the Periphery Is Enhanced in Congenitally Deaf Individuals</a></div><div class="wp-workCard_item"><span>The Journal of Neuroscience</span><span>, 2000</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="5b7c5858843660adeef687ccfb1fd6b4" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84887760,&quot;asset_id&quot;:77554080,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84887760/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554080"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554080"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554080; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554080]").text(description); $(".js-view-count[data-work-id=77554080]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554080; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554080']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554080, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "5b7c5858843660adeef687ccfb1fd6b4" } } $('.js-work-strip[data-work-id=77554080]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554080,"title":"Visual Attention to the Periphery Is Enhanced in Congenitally Deaf Individuals","translated_title":"","metadata":{"publisher":"Society for 
Neuroscience","publication_date":{"day":null,"month":null,"year":2000,"errors":{}},"publication_name":"The Journal of Neuroscience"},"translated_abstract":null,"internal_url":"https://www.academia.edu/77554080/Visual_Attention_to_the_Periphery_Is_Enhanced_in_Congenitally_Deaf_Individuals","translated_internal_url":"","created_at":"2022-04-25T04:29:42.832-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84887760,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887760/thumbnails/1.jpg","file_name":"RC93.full.pdf","download_url":"https://www.academia.edu/attachments/84887760/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Visual_Attention_to_the_Periphery_Is_Enh.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887760/RC93.full-libre.pdf?1650938757=\u0026response-content-disposition=attachment%3B+filename%3DVisual_Attention_to_the_Periphery_Is_Enh.pdf\u0026Expires=1733053721\u0026Signature=XfEUr6WsiZXSR-AW8GxUdoA2~Rgv3MlSGQQjc7cGunecFFvAws7Zjw6iq5lu1RqHAfmy8ri0VRD-75gW2XGIdaqO5rcaw7QKr4d~KibndHy4gk0KVt-Z12SdZPxuW8msTTaF28HhzL6egovwvdVBt94Yq6D~SVAxwsjPRZE8heqWPaD6sfhG3rVp8WOUQ4CE9AcQxWEKlXIGjdOotsX032~6gii-qYyfyntP2EDBh9xTuc7VAYmCIikPQ023rxBperVzSkVwlbM625KMBttXJTQ9TioYHftkfr1Io5LURkvLZYFuFUxdHhAXcBZ0Sk-d0~ZOLE9cZDnDnpLC0SB7pg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Visual_Attention_to_the_Periphery_Is_Enhanced_in_Congenitally_Deaf_Individuals","translated_slug":"","page_count":6,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84887760,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887760/thumbnails/1.jpg","file_name":"RC93.full.pdf","download_url":"https://www.academia.edu/attachments/84887760/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Visual_Attention_to_the_Periphery_Is_Enh.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887760/RC93.full-libre.pdf?1650938757=\u0026response-content-disposition=attachment%3B+filename%3DVisual_Attention_to_the_Periphery_Is_Enh.pdf\u0026Expires=1733053721\u0026Signature=XfEUr6WsiZXSR-AW8GxUdoA2~Rgv3MlSGQQjc7cGunecFFvAws7Zjw6iq5lu1RqHAfmy8ri0VRD-75gW2XGIdaqO5rcaw7QKr4d~KibndHy4gk0KVt-Z12SdZPxuW8msTTaF28HhzL6egovwvdVBt94Yq6D~SVAxwsjPRZE8heqWPaD6sfhG3rVp8WOUQ4CE9AcQxWEKlXIGjdOotsX032~6gii-qYyfyntP2EDBh9xTuc7VAYmCIikPQ023rxBperVzSkVwlbM625KMBttXJTQ9TioYHftkfr1Io5LURkvLZYFuFUxdHhAXcBZ0Sk-d0~ZOLE9cZDnDnpLC0SB7pg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":161,"name":"Neuroscience","url":"https://www.academia.edu/Documents/in/Neuroscience"},{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":4008,"name":"Visual attention","url":"https://www.academia.edu/Documents/in/Visual_attention"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":123274,"name":"Parietal Cortex","url":"https://www.academia.edu/Documents/in/Parietal_Cortex"},{"id":123277,"name":"Posterior Parietal 
Cortex","url":"https://www.academia.edu/Documents/in/Posterior_Parietal_Cortex"},{"id":124951,"name":"Effective Connectivity","url":"https://www.academia.edu/Documents/in/Effective_Connectivity"},{"id":214510,"name":"Structural Equation Model","url":"https://www.academia.edu/Documents/in/Structural_Equation_Model"},{"id":298502,"name":"Area Mt","url":"https://www.academia.edu/Documents/in/Area_Mt"},{"id":914074,"name":"Visual Field","url":"https://www.academia.edu/Documents/in/Visual_Field"},{"id":2922956,"name":"Psychology and Cognitive Sciences","url":"https://www.academia.edu/Documents/in/Psychology_and_Cognitive_Sciences"},{"id":3095916,"name":"normal hearing","url":"https://www.academia.edu/Documents/in/normal_hearing"},{"id":3763225,"name":"Medical and Health Sciences","url":"https://www.academia.edu/Documents/in/Medical_and_Health_Sciences"}],"urls":[{"id":19887684,"url":"https://syndication.highwire.org/content/doi/10.1523/JNEUROSCI.20-17-j0001.2000"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554079"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554079/Real_time_lexical_comprehension_in_young_children_learning_American_Sign_Language"><img alt="Research paper thumbnail of Real-time lexical comprehension in young children learning American Sign Language" class="work-thumbnail" src="https://attachments.academia-assets.com/84887708/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554079/Real_time_lexical_comprehension_in_young_children_learning_American_Sign_Language">Real-time lexical comprehension in young children learning American Sign Language</a></div><div class="wp-workCard_item"><span>Developmental science</span><span>, Jan 16, 2018</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">When children interpret spoken language in real time, linguistic information drives rapid shifts ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children&amp;#39;s developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. 
Deaf and hearing ASL-learners showed similar gaze patterns, suggesting that the in-the-moment dynamics of eye movements during ASL processing are shaped by the constraints of processing a visual language in real time and not by dif...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="1628de48f5b20bb931119fa7077a82b1" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84887708,&quot;asset_id&quot;:77554079,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84887708/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554079"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554079"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554079; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554079]").text(description); $(".js-view-count[data-work-id=77554079]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554079; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554079']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554079, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "1628de48f5b20bb931119fa7077a82b1" } } $('.js-work-strip[data-work-id=77554079]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554079,"title":"Real-time lexical comprehension in young children learning American Sign Language","translated_title":"","metadata":{"abstract":"When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. 
This language-vision interaction can provide insights into children\u0026#39;s developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. Deaf and hearing ASL-learners showed similar gaze patterns, suggesting that the in-the-moment dynamics of eye movements during ASL processing are shaped by the constraints of processing a visual language in real time and not by dif...","publication_date":{"day":16,"month":1,"year":2018,"errors":{}},"publication_name":"Developmental science"},"translated_abstract":"When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children\u0026#39;s developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. Deaf and hearing ASL-learners showed similar gaze patterns, suggesting that the in-the-moment dynamics of eye movements during ASL processing are shaped by the constraints of processing a visual language in real time and not by 
dif...","internal_url":"https://www.academia.edu/77554079/Real_time_lexical_comprehension_in_young_children_learning_American_Sign_Language","translated_internal_url":"","created_at":"2022-04-25T04:29:42.734-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84887708,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887708/thumbnails/1.jpg","file_name":"download.pdf","download_url":"https://www.academia.edu/attachments/84887708/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Real_time_lexical_comprehension_in_young.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887708/download-libre.pdf?1650938781=\u0026response-content-disposition=attachment%3B+filename%3DReal_time_lexical_comprehension_in_young.pdf\u0026Expires=1733053721\u0026Signature=Av9~nCILWaN4sns5wrYt-l36sgsZJzBf2C56b5aDSIx1C9Cnwgpop~0mchxWJzYEFhdnHPa1N3hZRDgtoVwSjqYuhoo8Nr402tDKy7sNDzEn~vsCJJyoKw3DIqOTFHTan226RUrw379QtLtmqMYiSu7D77GtFljx3Xy04Jg1rH4GbmqzYJhRpLIXFS2CD1xoLhfwZQV-w97DDmyxO2Ik-Ktubax1opyfrLo4pmdpE75dHx46T~WQYNfEAE7mk0QLllJ~4BoD3FrF6LlBsTjPgslgU5gzF04joi~rdheSF2D--znzw8bpfjaW90aRF6-IN3L8WZb3MmPliMbref2tBA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Real_time_lexical_comprehension_in_young_children_learning_American_Sign_Language","translated_slug":"","page_count":47,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84887708,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887708/thumbnails/1.jpg","file_name":"download.pdf","download_url":"https://www.academia.edu/attachments/84887708/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Real_time_lexical_comprehension_in_young.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887708/download-libre.pdf?1650938781=\u0026response-content-disposition=attachment%3B+filename%3DReal_time_lexical_comprehension_in_young.pdf\u0026Expires=1733053721\u0026Signature=Av9~nCILWaN4sns5wrYt-l36sgsZJzBf2C56b5aDSIx1C9Cnwgpop~0mchxWJzYEFhdnHPa1N3hZRDgtoVwSjqYuhoo8Nr402tDKy7sNDzEn~vsCJJyoKw3DIqOTFHTan226RUrw379QtLtmqMYiSu7D77GtFljx3Xy04Jg1rH4GbmqzYJhRpLIXFS2CD1xoLhfwZQV-w97DDmyxO2Ik-Ktubax1opyfrLo4pmdpE75dHx46T~WQYNfEAE7mk0QLllJ~4BoD3FrF6LlBsTjPgslgU5gzF04joi~rdheSF2D--znzw8bpfjaW90aRF6-IN3L8WZb3MmPliMbref2tBA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":237,"name":"Cognitive Science","url":"https://www.academia.edu/Documents/in/Cognitive_Science"},{"id":15674,"name":"Linguistics","url":"https://www.academia.edu/Documents/in/Linguistics"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":442068,"name":"Developmental Science","url":"https://www.academia.edu/Documents/in/Developmental_Science"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" 
data-work-id="77554078"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554078/Brain_and_Language"><img alt="Research paper thumbnail of Brain and Language" class="work-thumbnail" src="https://attachments.academia-assets.com/84887706/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554078/Brain_and_Language">Brain and Language</a></div><div class="wp-workCard_item"><span>Neuron</span><span>, 1998</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="11d854745ac8b23a19d80505f0acdd08" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84887706,&quot;asset_id&quot;:77554078,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84887706/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554078"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554078"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554078; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554078]").text(description); $(".js-view-count[data-work-id=77554078]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554078; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554078']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554078, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: 
"11d854745ac8b23a19d80505f0acdd08" } } $('.js-work-strip[data-work-id=77554078]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554078,"title":"Brain and Language","translated_title":"","metadata":{"publication_date":{"day":null,"month":null,"year":1998,"errors":{}},"publication_name":"Neuron"},"translated_abstract":null,"internal_url":"https://www.academia.edu/77554078/Brain_and_Language","translated_internal_url":"","created_at":"2022-04-25T04:29:42.605-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84887706,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887706/thumbnails/1.jpg","file_name":"ATTACHMENT01.pdf","download_url":"https://www.academia.edu/attachments/84887706/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Brain_and_Language.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887706/ATTACHMENT01-libre.pdf?1650938760=\u0026response-content-disposition=attachment%3B+filename%3DBrain_and_Language.pdf\u0026Expires=1733053721\u0026Signature=XAv6J71nGkx93-OSnfgiPMgSODjpwt-SQDMctxucM-ypDpil-~hrnqcvx7~06nUvKASIw3Fnl8Nz1EH4TOz2IUXLRFy~RMVB78LYg03p1PG4g9wuTRD1EChHLAfFgfUxG75qSM72SbYM~yJleb43LUGguCRC-Hw1vljgro1TmeavoqG~nThIl7bLZ0ov7LkR8SggCq~zeZvs-w4S6JryM05sifZf~FswWWdSwZte1MYizuWPFFR0PQhnd4T4RvBmNK0rKLQYtyYn3TMYTO~-9EmsCvvz1qbIA1rW-E4IO0Iy2jqDEwuxGVnTlR9tI73Z1~rSIg8GvrssR656u62N-Q__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Brain_and_Language","translated_slug":"","page_count":5,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84887706,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887706/thumbnails/1.jpg","file_name":"ATTACHMENT01.pdf","download_url":"https://www.academia.edu/attachments/84887706/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Brain_and_Language.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887706/ATTACHMENT01-libre.pdf?1650938760=\u0026response-content-disposition=attachment%3B+filename%3DBrain_and_Language.pdf\u0026Expires=1733053721\u0026Signature=XAv6J71nGkx93-OSnfgiPMgSODjpwt-SQDMctxucM-ypDpil-~hrnqcvx7~06nUvKASIw3Fnl8Nz1EH4TOz2IUXLRFy~RMVB78LYg03p1PG4g9wuTRD1EChHLAfFgfUxG75qSM72SbYM~yJleb43LUGguCRC-Hw1vljgro1TmeavoqG~nThIl7bLZ0ov7LkR8SggCq~zeZvs-w4S6JryM05sifZf~FswWWdSwZte1MYizuWPFFR0PQhnd4T4RvBmNK0rKLQYtyYn3TMYTO~-9EmsCvvz1qbIA1rW-E4IO0Iy2jqDEwuxGVnTlR9tI73Z1~rSIg8GvrssR656u62N-Q__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":237,"name":"Cognitive Science","url":"https://www.academia.edu/Documents/in/Cognitive_Science"},{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":1214,"name":"Sign 
Language","url":"https://www.academia.edu/Documents/in/Sign_Language"},{"id":18174,"name":"Language","url":"https://www.academia.edu/Documents/in/Language"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":61474,"name":"Brain","url":"https://www.academia.edu/Documents/in/Brain"},{"id":473565,"name":"Neuron","url":"https://www.academia.edu/Documents/in/Neuron"},{"id":1239755,"name":"Neurosciences","url":"https://www.academia.edu/Documents/in/Neurosciences"},{"id":2234200,"name":"Functional Laterality","url":"https://www.academia.edu/Documents/in/Functional_Laterality"}],"urls":[{"id":19887683,"url":"http://sciencedirect.com/science/article/pii/s089662730080536x"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554077"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554077/The_Processing_of_Biologically_Plausible_and_Implausible_forms_in_American_Sign_Language_Evidence_for_Perceptual_Tuning"><img alt="Research paper thumbnail of The Processing of Biologically Plausible and Implausible forms in American Sign Language: Evidence for Perceptual Tuning" class="work-thumbnail" src="https://attachments.academia-assets.com/84887715/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554077/The_Processing_of_Biologically_Plausible_and_Implausible_forms_in_American_Sign_Language_Evidence_for_Perceptual_Tuning">The Processing of Biologically Plausible and Implausible forms in American Sign Language: Evidence for Perceptual Tuning</a></div><div class="wp-workCard_item"><span>Language, cognition and neuroscience</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">The human auditory system distinguishes speech-like information from general auditory signals in ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">The human auditory system distinguishes speech-like information from general auditory signals in a remarkably fast and efficient way. Combining psychophysics and neurophysiology (MEG), we demonstrate a similar result for the processing of visual information used for language communication in users of sign languages. We demonstrate that the earliest visual cortical responses in deaf signers viewing American Sign Language (ASL) signs show specific modulations to violations of anatomic constraints that would make the sign either possible or impossible to articulate. These neural data are accompanied with a significantly increased perceptual sensitivity to the anatomical incongruity. The differential effects in the early visual evoked potentials arguably reflect an expectation-driven assessment of somatic representational integrity, suggesting that language experience and/or auditory deprivation may shape the neuronal mechanisms underlying the analysis of complex human form. 
The data de...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="a4e6a8ca6c9185881871da7efcfd9199" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84887715,&quot;asset_id&quot;:77554077,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84887715/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554077"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554077"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554077; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554077]").text(description); $(".js-view-count[data-work-id=77554077]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554077; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554077']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554077, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "a4e6a8ca6c9185881871da7efcfd9199" } } $('.js-work-strip[data-work-id=77554077]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554077,"title":"The Processing of Biologically Plausible and Implausible forms in American Sign Language: Evidence for Perceptual Tuning","translated_title":"","metadata":{"abstract":"The human auditory system distinguishes speech-like information from general auditory signals in a remarkably fast and efficient way. Combining psychophysics and neurophysiology (MEG), we demonstrate a similar result for the processing of visual information used for language communication in users of sign languages. 
We demonstrate that the earliest visual cortical responses in deaf signers viewing American Sign Language (ASL) signs show specific modulations to violations of anatomic constraints that would make the sign either possible or impossible to articulate. These neural data are accompanied with a significantly increased perceptual sensitivity to the anatomical incongruity. The differential effects in the early visual evoked potentials arguably reflect an expectation-driven assessment of somatic representational integrity, suggesting that language experience and/or auditory deprivation may shape the neuronal mechanisms underlying the analysis of complex human form. The data de...","publication_name":"Language, cognition and neuroscience"},"translated_abstract":"The human auditory system distinguishes speech-like information from general auditory signals in a remarkably fast and efficient way. Combining psychophysics and neurophysiology (MEG), we demonstrate a similar result for the processing of visual information used for language communication in users of sign languages. We demonstrate that the earliest visual cortical responses in deaf signers viewing American Sign Language (ASL) signs show specific modulations to violations of anatomic constraints that would make the sign either possible or impossible to articulate. These neural data are accompanied with a significantly increased perceptual sensitivity to the anatomical incongruity. The differential effects in the early visual evoked potentials arguably reflect an expectation-driven assessment of somatic representational integrity, suggesting that language experience and/or auditory deprivation may shape the neuronal mechanisms underlying the analysis of complex human form. The data de...","internal_url":"https://www.academia.edu/77554077/The_Processing_of_Biologically_Plausible_and_Implausible_forms_in_American_Sign_Language_Evidence_for_Perceptual_Tuning","translated_internal_url":"","created_at":"2022-04-25T04:29:42.496-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84887715,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887715/thumbnails/1.jpg","file_name":"pmc4849140.pdf","download_url":"https://www.academia.edu/attachments/84887715/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"The_Processing_of_Biologically_Plausible.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887715/pmc4849140-libre.pdf?1650938760=\u0026response-content-disposition=attachment%3B+filename%3DThe_Processing_of_Biologically_Plausible.pdf\u0026Expires=1733053721\u0026Signature=cDo6mIstXIs~zCbx5djVAVug8aosvUWtom2V8rWQ~CYRVPMwMCnMO56fHvnR6HU-NdHTwda4F0X34eGPrfEryMX9WC9umdWf2TDnHbCNEmCZ~8pWNXubcWQQ3cLljwQXv6LAHwta7mhXcWdTW46fWdDNS95hlXbYtowoxI90wq-ag1r5-vQGdvS7ZRvOuEedOanL~IjNKqopsii40t2c~fiBM8we5RIaw2CC1GSPWR7j1QwL2tkGy6HY3sXmBvWB0b3KHC2f-fZlDXRI4noms34Kqqf5cPIj3bqC1Gz6hacXiSytv8RXYjpeDwWJAw5Z8tyBQgF18kBI~6LDhIfhxw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"The_Processing_of_Biologically_Plausible_and_Implausible_forms_in_American_Sign_Language_Evidence_for_Perceptual_Tuning","translated_slug":"","page_count":14,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"202
0-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84887715,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887715/thumbnails/1.jpg","file_name":"pmc4849140.pdf","download_url":"https://www.academia.edu/attachments/84887715/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"The_Processing_of_Biologically_Plausible.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887715/pmc4849140-libre.pdf?1650938760=\u0026response-content-disposition=attachment%3B+filename%3DThe_Processing_of_Biologically_Plausible.pdf\u0026Expires=1733053721\u0026Signature=cDo6mIstXIs~zCbx5djVAVug8aosvUWtom2V8rWQ~CYRVPMwMCnMO56fHvnR6HU-NdHTwda4F0X34eGPrfEryMX9WC9umdWf2TDnHbCNEmCZ~8pWNXubcWQQ3cLljwQXv6LAHwta7mhXcWdTW46fWdDNS95hlXbYtowoxI90wq-ag1r5-vQGdvS7ZRvOuEedOanL~IjNKqopsii40t2c~fiBM8we5RIaw2CC1GSPWR7j1QwL2tkGy6HY3sXmBvWB0b3KHC2f-fZlDXRI4noms34Kqqf5cPIj3bqC1Gz6hacXiSytv8RXYjpeDwWJAw5Z8tyBQgF18kBI~6LDhIfhxw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":953669,"name":"Routledge","url":"https://www.academia.edu/Documents/in/Routledge"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554076"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554076/Human_Temporal_Cortical_Single_Neuron_Activity_During_Working_Memory_Maintenance"><img alt="Research paper thumbnail of Human Temporal Cortical Single Neuron Activity During Working Memory Maintenance" class="work-thumbnail" src="https://attachments.academia-assets.com/84927744/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554076/Human_Temporal_Cortical_Single_Neuron_Activity_During_Working_Memory_Maintenance">Human Temporal Cortical Single Neuron Activity During Working Memory Maintenance</a></div><div class="wp-workCard_item"><span>Neuropsychologia</span><span>, 2016</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="1f2fd271ca1a1646b8b29ff56b41c227" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84927744,&quot;asset_id&quot;:77554076,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84927744/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554076"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa 
fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554076"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554076; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554076]").text(description); $(".js-view-count[data-work-id=77554076]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554076; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554076']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554076, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "1f2fd271ca1a1646b8b29ff56b41c227" } } $('.js-work-strip[data-work-id=77554076]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554076,"title":"Human Temporal Cortical Single Neuron Activity During Working Memory Maintenance","translated_title":"","metadata":{"publisher":"Elsevier 
BV","publication_date":{"day":null,"month":null,"year":2016,"errors":{}},"publication_name":"Neuropsychologia"},"translated_abstract":null,"internal_url":"https://www.academia.edu/77554076/Human_Temporal_Cortical_Single_Neuron_Activity_During_Working_Memory_Maintenance","translated_internal_url":"","created_at":"2022-04-25T04:29:42.375-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84927744,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84927744/thumbnails/1.jpg","file_name":"ptpmcrender.pdf","download_url":"https://www.academia.edu/attachments/84927744/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_Temporal_Cortical_Single_Neuron_Ac.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84927744/ptpmcrender-libre.pdf?1650938979=\u0026response-content-disposition=attachment%3B+filename%3DHuman_Temporal_Cortical_Single_Neuron_Ac.pdf\u0026Expires=1733053721\u0026Signature=aib-DEu0v6y7H1--mPBVX0phzbqp-40jXmyBT3ZvoFiim0N0y9hGlMLYYyHRuf1iFq~23XwFSsrdw537CKa7D0ltjJoaEugN2TuS-P9fEx2hjsffkag3YJiu9XJphuOeZg3OonwghGtnCvGrzEZG41rC7keUOVCTAEr~whR8dwiilPLbj51lcaKOXEJ3aO7q5WlRC5rqayAnxbEVUXhhiX0E8oh7KfLez7JTZ74sVx-ES1nmiU7FTy9qDix1zLmhaU40s3GWv5UKEeE98~uBgOUd53pX4-S8JRcuOoICuzbWUsUacskzN1aWZliiCzILGzwfVPWyw1lsJu476KZUyA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Human_Temporal_Cortical_Single_Neuron_Activity_During_Working_Memory_Maintenance","translated_slug":"","page_count":29,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84927744,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84927744/thumbnails/1.jpg","file_name":"ptpmcrender.pdf","download_url":"https://www.academia.edu/attachments/84927744/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_Temporal_Cortical_Single_Neuron_Ac.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84927744/ptpmcrender-libre.pdf?1650938979=\u0026response-content-disposition=attachment%3B+filename%3DHuman_Temporal_Cortical_Single_Neuron_Ac.pdf\u0026Expires=1733053721\u0026Signature=aib-DEu0v6y7H1--mPBVX0phzbqp-40jXmyBT3ZvoFiim0N0y9hGlMLYYyHRuf1iFq~23XwFSsrdw537CKa7D0ltjJoaEugN2TuS-P9fEx2hjsffkag3YJiu9XJphuOeZg3OonwghGtnCvGrzEZG41rC7keUOVCTAEr~whR8dwiilPLbj51lcaKOXEJ3aO7q5WlRC5rqayAnxbEVUXhhiX0E8oh7KfLez7JTZ74sVx-ES1nmiU7FTy9qDix1zLmhaU40s3GWv5UKEeE98~uBgOUd53pX4-S8JRcuOoICuzbWUsUacskzN1aWZliiCzILGzwfVPWyw1lsJu476KZUyA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":237,"name":"Cognitive Science","url":"https://www.academia.edu/Documents/in/Cognitive_Science"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":452621,"name":"Neuropsychologia","url":"https://www.academia.edu/Documents/in/Neuropsychologia"},{"id":1239755,"name":"Neurosciences","url":"https://www.academia.edu/Documents/in/Neurosciences"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); 
$a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554074"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554074/nfluences_of_Linguistic_and_Non_Linguistic_Factors_in_the_Processing_of_American_Sign_Language_Evidence_from_Handshape_Monitoring"><img alt="Research paper thumbnail of nfluences of Linguistic and Non-Linguistic Factors in the Processing of American Sign Language: Evidence from Handshape Monitoring" class="work-thumbnail" src="https://attachments.academia-assets.com/84887720/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554074/nfluences_of_Linguistic_and_Non_Linguistic_Factors_in_the_Processing_of_American_Sign_Language_Evidence_from_Handshape_Monitoring">nfluences of Linguistic and Non-Linguistic Factors in the Processing of American Sign Language: Evidence from Handshape Monitoring</a></div><div class="wp-workCard_item"><span>Annual Meeting of the Berkeley Linguistics Society</span><span>, 2009</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="93a75e4b43fee9205880dab5876bf6c2" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84887720,&quot;asset_id&quot;:77554074,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84887720/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554074"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554074"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554074; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554074]").text(description); $(".js-view-count[data-work-id=77554074]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554074; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554074']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554074, container: "", }); 
});</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "93a75e4b43fee9205880dab5876bf6c2" } } $('.js-work-strip[data-work-id=77554074]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554074,"title":"nfluences of Linguistic and Non-Linguistic Factors in the Processing of American Sign Language: Evidence from Handshape Monitoring","translated_title":"","metadata":{"publisher":"Linguistic Society of America","publication_date":{"day":null,"month":null,"year":2009,"errors":{}},"publication_name":"Annual Meeting of the Berkeley Linguistics Society"},"translated_abstract":null,"internal_url":"https://www.academia.edu/77554074/nfluences_of_Linguistic_and_Non_Linguistic_Factors_in_the_Processing_of_American_Sign_Language_Evidence_from_Handshape_Monitoring","translated_internal_url":"","created_at":"2022-04-25T04:29:42.244-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84887720,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887720/thumbnails/1.jpg","file_name":"e2d11e764b3fa98dbb2c0f9f622366d6571c.pdf","download_url":"https://www.academia.edu/attachments/84887720/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"nfluences_of_Linguistic_and_Non_Linguist.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887720/e2d11e764b3fa98dbb2c0f9f622366d6571c-libre.pdf?1650938760=\u0026response-content-disposition=attachment%3B+filename%3Dnfluences_of_Linguistic_and_Non_Linguist.pdf\u0026Expires=1733053721\u0026Signature=L5w8ySSywL1II8cnHOoqCjVYeipnTeI5jaCUGXM4IZyTA-2m8OcLgcsenLmNOFYcQWt3oF2S3kdXAs9-9LOoxZT2o79Zb6XikkCprn61Cxoit~aCgqULpwS~XMAZ4LkSU7bJJ6xFP3zvDn6QhcSWMaBVUWJi9xV-GQQnpO71JBHz38LwZToDlkxrYFA53wOxPkeDp~2WNpszRvDKsAu21kojUy-RCtJFqRC~vXs3mKBA-myK91Cf~-t6UyZEXW0vC5nSHUDY6XMtWgMH3mdomrg14q1agBIdsvOKzEU0984D0uYIrvfuKAmos2XB1pAUIU~493r0IOa9jhWUkogiJA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"nfluences_of_Linguistic_and_Non_Linguistic_Factors_in_the_Processing_of_American_Sign_Language_Evidence_from_Handshape_Monitoring","translated_slug":"","page_count":12,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David 
Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84887720,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887720/thumbnails/1.jpg","file_name":"e2d11e764b3fa98dbb2c0f9f622366d6571c.pdf","download_url":"https://www.academia.edu/attachments/84887720/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"nfluences_of_Linguistic_and_Non_Linguist.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887720/e2d11e764b3fa98dbb2c0f9f622366d6571c-libre.pdf?1650938760=\u0026response-content-disposition=attachment%3B+filename%3Dnfluences_of_Linguistic_and_Non_Linguist.pdf\u0026Expires=1733053721\u0026Signature=L5w8ySSywL1II8cnHOoqCjVYeipnTeI5jaCUGXM4IZyTA-2m8OcLgcsenLmNOFYcQWt3oF2S3kdXAs9-9LOoxZT2o79Zb6XikkCprn61Cxoit~aCgqULpwS~XMAZ4LkSU7bJJ6xFP3zvDn6QhcSWMaBVUWJi9xV-GQQnpO71JBHz38LwZToDlkxrYFA53wOxPkeDp~2WNpszRvDKsAu21kojUy-RCtJFqRC~vXs3mKBA-myK91Cf~-t6UyZEXW0vC5nSHUDY6XMtWgMH3mdomrg14q1agBIdsvOKzEU0984D0uYIrvfuKAmos2XB1pAUIU~493r0IOa9jhWUkogiJA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554073"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554073/An_fMRI_study_of_perception_and_action_in_deaf_signers"><img alt="Research paper thumbnail of An fMRI study of perception and action in. deaf signers" class="work-thumbnail" src="https://attachments.academia-assets.com/84927702/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554073/An_fMRI_study_of_perception_and_action_in_deaf_signers">An fMRI study of perception and action in. 
deaf signers</a></div><div class="wp-workCard_item"><span>Neuropsychologia</span><span>, 2016</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="57029de8125446e035c9c0f5b84ee08b" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84927702,&quot;asset_id&quot;:77554073,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84927702/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554073"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554073"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554073; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554073]").text(description); $(".js-view-count[data-work-id=77554073]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554073; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554073']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554073, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "57029de8125446e035c9c0f5b84ee08b" } } $('.js-work-strip[data-work-id=77554073]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554073,"title":"An fMRI study of perception and action in. 
deaf signers","translated_title":"","metadata":{"publisher":"Elsevier BV","publication_date":{"day":null,"month":null,"year":2016,"errors":{}},"publication_name":"Neuropsychologia"},"translated_abstract":null,"internal_url":"https://www.academia.edu/77554073/An_fMRI_study_of_perception_and_action_in_deaf_signers","translated_internal_url":"","created_at":"2022-04-25T04:29:42.105-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84927702,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84927702/thumbnails/1.jpg","file_name":"ptpmcrender.pdf","download_url":"https://www.academia.edu/attachments/84927702/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"An_fMRI_study_of_perception_and_action_i.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84927702/ptpmcrender-libre.pdf?1650938981=\u0026response-content-disposition=attachment%3B+filename%3DAn_fMRI_study_of_perception_and_action_i.pdf\u0026Expires=1733053721\u0026Signature=NWHnSKM3kfhRLa66YNRd0pvdtVc2l58ICj5QRc~w0MF2hhEfMuA5CorfuqCtMl28bSitUzrDlkD3t2NarF7D-IU~zbPpZioXGMXP6kJ8fMNLb29xyQjwgfXkCv2ZL-MIjvfsnfCY2LqfuAJQL6eLhDesBWouQgyWfphyQ6p18EwZrq-~EntyFyHNISoKUgOPClK78iZrWWu11Sg1xyfp5WslTDhV43MRU1ez2jvmO3~1Xu~7t~GnxQcp7DZygweWtPVjWUbPnBR99mXP~q-p-L~oClDjAgO1uOH2FTpSiryCuwbMfNCnV6mrDiddW2w-i4oxo~F898nH~x2fAJ4Cag__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"An_fMRI_study_of_perception_and_action_in_deaf_signers","translated_slug":"","page_count":24,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84927702,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84927702/thumbnails/1.jpg","file_name":"ptpmcrender.pdf","download_url":"https://www.academia.edu/attachments/84927702/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"An_fMRI_study_of_perception_and_action_i.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84927702/ptpmcrender-libre.pdf?1650938981=\u0026response-content-disposition=attachment%3B+filename%3DAn_fMRI_study_of_perception_and_action_i.pdf\u0026Expires=1733053721\u0026Signature=NWHnSKM3kfhRLa66YNRd0pvdtVc2l58ICj5QRc~w0MF2hhEfMuA5CorfuqCtMl28bSitUzrDlkD3t2NarF7D-IU~zbPpZioXGMXP6kJ8fMNLb29xyQjwgfXkCv2ZL-MIjvfsnfCY2LqfuAJQL6eLhDesBWouQgyWfphyQ6p18EwZrq-~EntyFyHNISoKUgOPClK78iZrWWu11Sg1xyfp5WslTDhV43MRU1ez2jvmO3~1Xu~7t~GnxQcp7DZygweWtPVjWUbPnBR99mXP~q-p-L~oClDjAgO1uOH2FTpSiryCuwbMfNCnV6mrDiddW2w-i4oxo~F898nH~x2fAJ4Cag__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":237,"name":"Cognitive Science","url":"https://www.academia.edu/Documents/in/Cognitive_Science"},{"id":1214,"name":"Sign Language","url":"https://www.academia.edu/Documents/in/Sign_Language"},{"id":2349,"name":"Semantics","url":"https://www.academia.edu/Documents/in/Semantics"},{"id":6200,"name":"Magnetic Resonance 
Imaging","url":"https://www.academia.edu/Documents/in/Magnetic_Resonance_Imaging"},{"id":22506,"name":"Adolescent","url":"https://www.academia.edu/Documents/in/Adolescent"},{"id":52176,"name":"Brain Mapping","url":"https://www.academia.edu/Documents/in/Brain_Mapping"},{"id":61474,"name":"Brain","url":"https://www.academia.edu/Documents/in/Brain"},{"id":153836,"name":"Motor Cortex","url":"https://www.academia.edu/Documents/in/Motor_Cortex"},{"id":226636,"name":"Deafness","url":"https://www.academia.edu/Documents/in/Deafness"},{"id":406036,"name":"Parietal Lobe","url":"https://www.academia.edu/Documents/in/Parietal_Lobe"},{"id":452621,"name":"Neuropsychologia","url":"https://www.academia.edu/Documents/in/Neuropsychologia"},{"id":1239755,"name":"Neurosciences","url":"https://www.academia.edu/Documents/in/Neurosciences"},{"id":1959585,"name":"Broca area","url":"https://www.academia.edu/Documents/in/Broca_area"},{"id":2444775,"name":"Psychomotor Performance","url":"https://www.academia.edu/Documents/in/Psychomotor_Performance"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554072"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554072/Brain_and_Language_Minireview_a_Perspective_from_Sign_Language"><img alt="Research paper thumbnail of Brain and Language: Minireview a Perspective from Sign Language" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554072/Brain_and_Language_Minireview_a_Perspective_from_Sign_Language">Brain and Language: Minireview a Perspective from Sign Language</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Psychology Department representations found in spoken language, includingUniversity of Oregon pho...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Psychology Department representations found in spoken language, includingUniversity of Oregon phonology, morphology, syntax, semantics, and prag-Eugene, Oregon 97403 matics (Lillo-Martin, 1991; Corina and Sandler, 1993).Thus, similar linguistic structures are found in spokenand signed languages. A number of authors have pro-One of the most enduring and significant findings from posed that the left hemisphere recruitmentfor languageneuropsychology is the left hemisphere dominance for results from aspecialization oftheseareas forthe analy-language processing. Studies both past and present sis of linguistic structures. 
By this view, the structuralconverge to establish a widespread language network similarity between signed and spoken languages pre-in the left peri-sylvian cortex which encompasses at dicts that left hemisphere language areas should alsoleast four main regions: Broca’s area, within the inferior be recruited during ASL processing.prefrontal cortex; Wernicke’s area, within the posterior On the surface, however, ASL differs markedly fromtwo-thirds of the superior temporal lobe; the anterior spoken languages. For example, in ASL, phonologicalportion of the superior temporal lobe; and the middle distinctions are created by the positions and shape ofprefrontal cortex (Neville and Bavelier, 1998). While the the hands relative to the body rather than by acousticlanguage processing abilities of the left hemisphere are features such as nasality and voicing found in spokenuncontroversial, little is known about the determinants languages. The fact that signed and spoken languagesof this left hemisphere specialization for language. Are rely on different input and output modalities carries im-these areas geneticallydetermined to processlinguistic portant consequences for theories on the origin of theinformation? To what extent is this organization influ- left hemisphere dominance for language. It is often ar-enced by the language experience of each individual? gued that the left hemisphere specialization for lan-guage originates from a left hemisphere advantage toWhat role doesthe acoustic structure of languagesplayexecute fine temporal discrimination, such as the fastin this pattern of organization?acoustic processingrequired during speech perceptionTodate,mostofourunderstandingoftheneuralbases(Tallal et al., 1993). By this view, the standard left hemi-of language is derived from the studies of spoken lan-sphere language areas may not be recruited during theguages. Unfortunately, this spoken language bias limitsprocessing of visuo-spatial languages such as ASL.our ability to infer the determinants of left hemisphere Signed and spoken languages also differ by the wayspecialization for human language. For example, weare they convey linguistic information. While most aspectsunable to assess whether left hemisphere dominance of spoken languages rely on fast acoustic transitionsarises from the analysis of the sequential/hierarchical (e.g., consonant contrast) and temporalorderingof con-structures that are the building blocks of natural lan- stituents (e.g., suffixation, prefixation, word order, etc.),guages or rather is attributable to processing of the sign languages make significant use of visuo-spatialacoustic signal of spoken language. devices. For example, the use of signing space as aAmerican Sign Language (ASL), which makes use of staging ground for the depiction of grammatical rela-spatial location and motion of the hands in encoding tions is a prominent feature of ASL syntax. As shown inlinguistic information, enables us to investigate this is- Figure 1, in ASL,nominals introducedinto the discoursesue. The comparison of the neural representations of are assigned arbitrary reference points in a horizontalspoken and signed languages permits the separationof plane of signing space. 
Signs with pronominal functionthose brain structures that are common to all natural are directedtoward thesepoints, and verb signs obliga-human languages from those that are determined by torilymovebetweensuchpointsinspecifyinggrammati-the modality in which a language develops, providing cal relations (subject of, object of). Thus, grammaticalnew insight into the specificity of left hemisphere spe- functions served in many spoken languages by casecialization for language. marking or by linear ordering of words are fulfilled inIn this paper, we will first review some properties of ASL by spatial mechanisms; this is often referred to asASL and then discuss the contribution of the left hemi- “spatialized syntax” (Lillo-Martin, 1991; Poizner et al.,sphere and that of the right hemisphere to ASL pro- 1987; but see Liddell, 1998, for an alternative view).cessing. Another example of ASL processing that makes specialuse of visuo-spatial information is the classifier system.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554072"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554072"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554072; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554072]").text(description); $(".js-view-count[data-work-id=77554072]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554072; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554072']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554072, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=77554072]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554072,"title":"Brain and Language: Minireview a Perspective from Sign 
Language","translated_title":"","metadata":{"abstract":"Psychology Department representations found in spoken language, includingUniversity of Oregon phonology, morphology, syntax, semantics, and prag-Eugene, Oregon 97403 matics (Lillo-Martin, 1991; Corina and Sandler, 1993).Thus, similar linguistic structures are found in spokenand signed languages. A number of authors have pro-One of the most enduring and significant findings from posed that the left hemisphere recruitmentfor languageneuropsychology is the left hemisphere dominance for results from aspecialization oftheseareas forthe analy-language processing. Studies both past and present sis of linguistic structures. By this view, the structuralconverge to establish a widespread language network similarity between signed and spoken languages pre-in the left peri-sylvian cortex which encompasses at dicts that left hemisphere language areas should alsoleast four main regions: Broca’s area, within the inferior be recruited during ASL processing.prefrontal cortex; Wernicke’s area, within the posterior On the surface, however, ASL differs markedly fromtwo-thirds of the superior temporal lobe; the anterior spoken languages. For example, in ASL, phonologicalportion of the superior temporal lobe; and the middle distinctions are created by the positions and shape ofprefrontal cortex (Neville and Bavelier, 1998). While the the hands relative to the body rather than by acousticlanguage processing abilities of the left hemisphere are features such as nasality and voicing found in spokenuncontroversial, little is known about the determinants languages. The fact that signed and spoken languagesof this left hemisphere specialization for language. Are rely on different input and output modalities carries im-these areas geneticallydetermined to processlinguistic portant consequences for theories on the origin of theinformation? To what extent is this organization influ- left hemisphere dominance for language. It is often ar-enced by the language experience of each individual? gued that the left hemisphere specialization for lan-guage originates from a left hemisphere advantage toWhat role doesthe acoustic structure of languagesplayexecute fine temporal discrimination, such as the fastin this pattern of organization?acoustic processingrequired during speech perceptionTodate,mostofourunderstandingoftheneuralbases(Tallal et al., 1993). By this view, the standard left hemi-of language is derived from the studies of spoken lan-sphere language areas may not be recruited during theguages. Unfortunately, this spoken language bias limitsprocessing of visuo-spatial languages such as ASL.our ability to infer the determinants of left hemisphere Signed and spoken languages also differ by the wayspecialization for human language. For example, weare they convey linguistic information. While most aspectsunable to assess whether left hemisphere dominance of spoken languages rely on fast acoustic transitionsarises from the analysis of the sequential/hierarchical (e.g., consonant contrast) and temporalorderingof con-structures that are the building blocks of natural lan- stituents (e.g., suffixation, prefixation, word order, etc.),guages or rather is attributable to processing of the sign languages make significant use of visuo-spatialacoustic signal of spoken language. devices. 
For example, the use of signing space as aAmerican Sign Language (ASL), which makes use of staging ground for the depiction of grammatical rela-spatial location and motion of the hands in encoding tions is a prominent feature of ASL syntax. As shown inlinguistic information, enables us to investigate this is- Figure 1, in ASL,nominals introducedinto the discoursesue. The comparison of the neural representations of are assigned arbitrary reference points in a horizontalspoken and signed languages permits the separationof plane of signing space. Signs with pronominal functionthose brain structures that are common to all natural are directedtoward thesepoints, and verb signs obliga-human languages from those that are determined by torilymovebetweensuchpointsinspecifyinggrammati-the modality in which a language develops, providing cal relations (subject of, object of). Thus, grammaticalnew insight into the specificity of left hemisphere spe- functions served in many spoken languages by casecialization for language. marking or by linear ordering of words are fulfilled inIn this paper, we will first review some properties of ASL by spatial mechanisms; this is often referred to asASL and then discuss the contribution of the left hemi- “spatialized syntax” (Lillo-Martin, 1991; Poizner et al.,sphere and that of the right hemisphere to ASL pro- 1987; but see Liddell, 1998, for an alternative view).cessing. Another example of ASL processing that makes specialuse of visuo-spatial information is the classifier system."},"translated_abstract":"Psychology Department representations found in spoken language, includingUniversity of Oregon phonology, morphology, syntax, semantics, and prag-Eugene, Oregon 97403 matics (Lillo-Martin, 1991; Corina and Sandler, 1993).Thus, similar linguistic structures are found in spokenand signed languages. A number of authors have pro-One of the most enduring and significant findings from posed that the left hemisphere recruitmentfor languageneuropsychology is the left hemisphere dominance for results from aspecialization oftheseareas forthe analy-language processing. Studies both past and present sis of linguistic structures. By this view, the structuralconverge to establish a widespread language network similarity between signed and spoken languages pre-in the left peri-sylvian cortex which encompasses at dicts that left hemisphere language areas should alsoleast four main regions: Broca’s area, within the inferior be recruited during ASL processing.prefrontal cortex; Wernicke’s area, within the posterior On the surface, however, ASL differs markedly fromtwo-thirds of the superior temporal lobe; the anterior spoken languages. For example, in ASL, phonologicalportion of the superior temporal lobe; and the middle distinctions are created by the positions and shape ofprefrontal cortex (Neville and Bavelier, 1998). While the the hands relative to the body rather than by acousticlanguage processing abilities of the left hemisphere are features such as nasality and voicing found in spokenuncontroversial, little is known about the determinants languages. The fact that signed and spoken languagesof this left hemisphere specialization for language. Are rely on different input and output modalities carries im-these areas geneticallydetermined to processlinguistic portant consequences for theories on the origin of theinformation? To what extent is this organization influ- left hemisphere dominance for language. It is often ar-enced by the language experience of each individual? 
gued that the left hemisphere specialization for lan-guage originates from a left hemisphere advantage toWhat role doesthe acoustic structure of languagesplayexecute fine temporal discrimination, such as the fastin this pattern of organization?acoustic processingrequired during speech perceptionTodate,mostofourunderstandingoftheneuralbases(Tallal et al., 1993). By this view, the standard left hemi-of language is derived from the studies of spoken lan-sphere language areas may not be recruited during theguages. Unfortunately, this spoken language bias limitsprocessing of visuo-spatial languages such as ASL.our ability to infer the determinants of left hemisphere Signed and spoken languages also differ by the wayspecialization for human language. For example, weare they convey linguistic information. While most aspectsunable to assess whether left hemisphere dominance of spoken languages rely on fast acoustic transitionsarises from the analysis of the sequential/hierarchical (e.g., consonant contrast) and temporalorderingof con-structures that are the building blocks of natural lan- stituents (e.g., suffixation, prefixation, word order, etc.),guages or rather is attributable to processing of the sign languages make significant use of visuo-spatialacoustic signal of spoken language. devices. For example, the use of signing space as aAmerican Sign Language (ASL), which makes use of staging ground for the depiction of grammatical rela-spatial location and motion of the hands in encoding tions is a prominent feature of ASL syntax. As shown inlinguistic information, enables us to investigate this is- Figure 1, in ASL,nominals introducedinto the discoursesue. The comparison of the neural representations of are assigned arbitrary reference points in a horizontalspoken and signed languages permits the separationof plane of signing space. Signs with pronominal functionthose brain structures that are common to all natural are directedtoward thesepoints, and verb signs obliga-human languages from those that are determined by torilymovebetweensuchpointsinspecifyinggrammati-the modality in which a language develops, providing cal relations (subject of, object of). Thus, grammaticalnew insight into the specificity of left hemisphere spe- functions served in many spoken languages by casecialization for language. marking or by linear ordering of words are fulfilled inIn this paper, we will first review some properties of ASL by spatial mechanisms; this is often referred to asASL and then discuss the contribution of the left hemi- “spatialized syntax” (Lillo-Martin, 1991; Poizner et al.,sphere and that of the right hemisphere to ASL pro- 1987; but see Liddell, 1998, for an alternative view).cessing. 
Unobtrusive integration of data management with fMRI analysis
Neuroinformatics, 2007
Systems","url":"https://www.academia.edu/Documents/in/Database_Management_Systems"},{"id":52176,"name":"Brain Mapping","url":"https://www.academia.edu/Documents/in/Brain_Mapping"},{"id":53293,"name":"Software","url":"https://www.academia.edu/Documents/in/Software"},{"id":216941,"name":"Data Center","url":"https://www.academia.edu/Documents/in/Data_Center"},{"id":1239755,"name":"Neurosciences","url":"https://www.academia.edu/Documents/in/Neurosciences"},{"id":1681026,"name":"Biochemistry and cell biology","url":"https://www.academia.edu/Documents/in/Biochemistry_and_cell_biology"},{"id":2057366,"name":"Software Package","url":"https://www.academia.edu/Documents/in/Software_Package"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554070"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554070/A_Connectionist_Perspective_on_Prosodic_Structure"><img alt="Research paper thumbnail of A Connectionist Perspective on Prosodic Structure" class="work-thumbnail" src="https://attachments.academia-assets.com/84887705/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554070/A_Connectionist_Perspective_on_Prosodic_Structure">A Connectionist Perspective on Prosodic Structure</a></div><div class="wp-workCard_item"><span>Annual Meeting of the Berkeley Linguistics Society</span><span>, 1989</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Proceedings of the Fifteenth Annual Meeting of the Berkeley Linguistics Society (1989), pp. 
114-125</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="25b2d1a2ea34f653f26a927d7b25f00a" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84887705,&quot;asset_id&quot;:77554070,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84887705/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554070"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554070"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554070; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554070]").text(description); $(".js-view-count[data-work-id=77554070]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554070; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554070']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554070, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "25b2d1a2ea34f653f26a927d7b25f00a" } } $('.js-work-strip[data-work-id=77554070]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554070,"title":"A Connectionist Perspective on Prosodic Structure","translated_title":"","metadata":{"abstract":"Proceedings of the Fifteenth Annual Meeting of the Berkeley Linguistics Society (1989), pp. 114-125","publisher":"Linguistic Society of America","publication_date":{"day":null,"month":null,"year":1989,"errors":{}},"publication_name":"Annual Meeting of the Berkeley Linguistics Society"},"translated_abstract":"Proceedings of the Fifteenth Annual Meeting of the Berkeley Linguistics Society (1989), pp. 
114-125","internal_url":"https://www.academia.edu/77554070/A_Connectionist_Perspective_on_Prosodic_Structure","translated_internal_url":"","created_at":"2022-04-25T04:29:41.683-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84887705,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887705/thumbnails/1.jpg","file_name":"e35f4473d10e323cdc7716de97148ae8a771.pdf","download_url":"https://www.academia.edu/attachments/84887705/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"A_Connectionist_Perspective_on_Prosodic.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887705/e35f4473d10e323cdc7716de97148ae8a771-libre.pdf?1650938762=\u0026response-content-disposition=attachment%3B+filename%3DA_Connectionist_Perspective_on_Prosodic.pdf\u0026Expires=1733053721\u0026Signature=FGwiqliR1-xIZR~ddEimnR3NNBDqkntCf56Zguu30fO492ovUS2XJFmBuocFFGnCRW49iPZPzT085SRj1pOEfkUFjQeL1cHSZgv8dolDoacIpv4P6~Xrk-UM2fUnqtig1IW6-KyTIWBMlINhLvLXjiEYTXnOOKzpAO1SIlcCf2vw2gASlFQmlYqkregCPhH6h4k398sDPAHYbY4hNxtc3smJaMD8Mz76s9sK50nagp9UmlTzKjbzLwsG7Shl8Bxlk7XvJLJRw5V4x~e7WFo48pLfO-5RrCNW~KwUVur9WA-J0Mbl83QcBprLsztyiqa2oodZ-YvQ8WOkIDp1-wgkhQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"A_Connectionist_Perspective_on_Prosodic_Structure","translated_slug":"","page_count":13,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84887705,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887705/thumbnails/1.jpg","file_name":"e35f4473d10e323cdc7716de97148ae8a771.pdf","download_url":"https://www.academia.edu/attachments/84887705/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"A_Connectionist_Perspective_on_Prosodic.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887705/e35f4473d10e323cdc7716de97148ae8a771-libre.pdf?1650938762=\u0026response-content-disposition=attachment%3B+filename%3DA_Connectionist_Perspective_on_Prosodic.pdf\u0026Expires=1733053721\u0026Signature=FGwiqliR1-xIZR~ddEimnR3NNBDqkntCf56Zguu30fO492ovUS2XJFmBuocFFGnCRW49iPZPzT085SRj1pOEfkUFjQeL1cHSZgv8dolDoacIpv4P6~Xrk-UM2fUnqtig1IW6-KyTIWBMlINhLvLXjiEYTXnOOKzpAO1SIlcCf2vw2gASlFQmlYqkregCPhH6h4k398sDPAHYbY4hNxtc3smJaMD8Mz76s9sK50nagp9UmlTzKjbzLwsG7Shl8Bxlk7XvJLJRw5V4x~e7WFo48pLfO-5RrCNW~KwUVur9WA-J0Mbl83QcBprLsztyiqa2oodZ-YvQ8WOkIDp1-wgkhQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":5423,"name":"Connectionism","url":"https://www.academia.edu/Documents/in/Connectionism"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554069"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" 
href="https://www.academia.edu/77554069/A_new_neuron_spike_sorting_method_using_maximal_overlap_discrete_wavelet_transform_and_rotated_principal_component_analysis"><img alt="Research paper thumbnail of A new neuron spike sorting method using maximal overlap discrete wavelet transform and rotated principal component analysis" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554069/A_new_neuron_spike_sorting_method_using_maximal_overlap_discrete_wavelet_transform_and_rotated_principal_component_analysis">A new neuron spike sorting method using maximal overlap discrete wavelet transform and rotated principal component analysis</a></div><div class="wp-workCard_item"><span>Proceedings of the 25th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEEE Cat. No.03CH37439)</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Abstract A new method for neuron spike sorting is presented that uses maximal overlap discrete wa...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Abstract A new method for neuron spike sorting is presented that uses maximal overlap discrete wavelet transform (MODWT) and rotated principal component analysis. MODWT is very effective in extracting the bandwidth of neuron firings without shape distortion. We then used a rotated principal component analysis to isolate unique neuron templates. In this procedure the first principal component serves as a neuron spike template. This component is then removed from the original data and the procedure is repeated. 
Thus, the recursive ...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554069"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554069"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554069; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554069]").text(description); $(".js-view-count[data-work-id=77554069]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554069; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554069']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554069, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=77554069]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554069,"title":"A new neuron spike sorting method using maximal overlap discrete wavelet transform and rotated principal component analysis","translated_title":"","metadata":{"abstract":"Abstract A new method for neuron spike sorting is presented that uses maximal overlap discrete wavelet transform (MODWT) and rotated principal component analysis. MODWT is very effective in extracting the bandwidth of neuron firings without shape distortion. We then used a rotated principal component analysis to isolate unique neuron templates. In this procedure the first principal component serves as a neuron spike template. This component is then removed from the original data and the procedure is repeated. Thus, the recursive ...","publisher":"IEEE","publication_name":"Proceedings of the 25th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEEE Cat. 
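The abstract describes a two-stage procedure: a MODWT front end that isolates the spike band without distorting spike shape, followed by a recursive step in which the first principal component of the detected waveforms serves as a spike template, is removed from the data, and the step is repeated. The implementation details are not given here, so the sketch below is only an illustration of that recursive-template idea: PyWavelets' undecimated transform (pywt.swt) stands in for the MODWT, the rotation of the principal components is omitted, and the window size and stopping rule are assumptions.

```python
# Illustrative sketch only (assumed, not the authors' implementation).
import numpy as np
import pywt


def suppress_slow_components(signal, wavelet="sym4", level=4):
    """Reconstruct from detail coefficients only, zeroing the approximation,
    so slow (LFP-like) components are removed without shifting spike shapes."""
    coeffs = pywt.swt(signal, wavelet, level=level)        # undecimated transform
    details_only = [(np.zeros_like(cA), cD) for cA, cD in coeffs]
    return pywt.iswt(details_only, wavelet)


def recursive_templates(windows, n_templates=3):
    """Peel off one template per pass: the first principal component of the
    residual waveforms is taken as a spike template, its contribution is
    projected out, and the step is repeated on the residual."""
    residual = windows - windows.mean(axis=0)
    templates = []
    for _ in range(n_templates):
        _, _, vt = np.linalg.svd(residual, full_matrices=False)
        pc1 = vt[0]                                         # first PC direction
        templates.append(pc1)
        residual = residual - np.outer(residual @ pc1, pc1) # deflate
    return np.array(templates)


# Toy usage with random data standing in for an extracellular recording.
rng = np.random.default_rng(0)
filtered = suppress_slow_components(rng.standard_normal(1024))
windows = np.lib.stride_tricks.sliding_window_view(filtered, 32)[::32]
print(recursive_templates(windows).shape)                   # (3, 32)
```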
Exploring the movement dynamics of manual and oral articulation: Evidence from coarticulation
Laboratory Phonology, 2012

This project explores three classes of human action through an investigation of long-distance coarticulation, defined here as the articulatory influence of one phonetic element (e.g., consonant or vowel) on another across more than one intervening element. Our first experiment investigated anticipatory vowel-to-vowel (VV) coarticulation in English. The second experiment was patterned after the first but deals instead with anticipatory location-to-location (LL) effects in American Sign Language (ASL). The sign experiment also incorporated a non-linguistic manual action, permitting a comparison of effects not only between spoken and signed language, but also between linguistic and non-linguistic manual actions. For the spoken-language study, sentences were created in which multiple consecutive schwas (target vowels) were followed by various context vowels. Eighteen English speakers were recorded as they repeated each sentence six times, and statistical tests were performed to determine...
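The abstract stops short of naming the statistical tests. Purely as an illustration of how anticipatory vowel-to-vowel coarticulation is often assessed, the sketch below runs a one-way test of whether a target schwa's second formant (F2) shifts with the identity of the upcoming context vowel; the simulated values, the choice of F2, and the one-way design are assumptions for the example, not details from the study.

```python
# Generic illustration (assumed analysis, not the paper's): does the F2 of a
# target schwa vary with the upcoming context vowel, i.e. anticipatory
# vowel-to-vowel coarticulation?
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)

# Simulated schwa F2 values (Hz) measured before three context vowels; a small
# shift is built in for /i/ vs /a/ contexts to mimic anticipation.
f2_before_i = 1500 + 40 + 30 * rng.standard_normal(50)
f2_before_a = 1500 - 40 + 30 * rng.standard_normal(50)
f2_before_u = 1500 + 30 * rng.standard_normal(50)

stat, p = f_oneway(f2_before_i, f2_before_a, f2_before_u)
print(f"F = {stat:.1f}, p = {p:.2g}")
# A reliable context effect on schwa F2 would be read as evidence of
# anticipatory coarticulation; per-speaker modeling and the distance
# manipulation in the study are omitted from this toy example.
```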
Human temporal cortical single neuron activity during working memory maintenance
Neuropsychologia, June 2016
href="https://www.academia.edu/attachments/114662216/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="119246721"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="119246721"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 119246721; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=119246721]").text(description); $(".js-view-count[data-work-id=119246721]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 119246721; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='119246721']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 119246721, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "024296379f3e988c5ba5835422d82d58" } } $('.js-work-strip[data-work-id=119246721]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":119246721,"title":"Human temporal cortical single neuron activity during working memory maintenance","translated_title":"","metadata":{"publisher":"Elsevier BV","grobid_abstract":"The Working Memory model of human memory, first introduced by Baddeley and Hitch (1974) , has been one of the most influential psychological constructs in cognitive psychology and human neuroscience. However the neuronal correlates of core components of this model have yet to be fully elucidated. Here we present data from two studies where human temporal cortical single neuron activity was recorded during tasks differentially affecting the maintenance component of verbal working memory. In Study One we vary the presence or absence of distracting items for the entire period of memory storage. In Study Two we vary the duration of storage so that distractors filled all, or only one-third of the time the memory was stored. 
Extracellular single neuron recordings were obtained from 36 subjects undergoing awake temporal lobe resections for epilepsy, 25 in Study one, 11 in Study two. Recordings were obtained from a total of 166 lateral temporal cortex neurons during performance of one of these two tasks, 86 study one, 80 study two. Significant changes in activity with distractor manipulation were present in 74 of these neurons (45%), 38 Study one, 36 Study two. In 48 (65%) of those there was increased activity during the period when distracting items were absent, 26 Study One, 22 Study Two. The magnitude of this increase was greater for Study One, 47.6%, than Study Two, 8.1%, paralleling the reduction in memory errors in the absence of distracters, for Study One of 70.3%, Study Two 26.3% These findings establish that human lateral temporal cortex is part of the neural system for working memory, with activity during maintenance of that memory that parallels performance, suggesting it represents active rehearsal. In 31 of these neurons (65%) this activity was an extension of that during working memory encoding that differed significantly from the neural processes recorded during overt and silent language tasks without a recent memory component, 17 Study one, 14 Study two. Contrary to the Baddeley model, that activity during verbal working memory maintenance often represented activity specific to working memory rather than speech or language.","publication_date":{"day":1,"month":6,"year":2016,"errors":{}},"publication_name":"Neuropsychologia","grobid_abstract_attachment_id":114662216},"translated_abstract":null,"internal_url":"https://www.academia.edu/119246721/Human_temporal_cortical_single_neuron_activity_during_working_memory_maintenance","translated_internal_url":"","created_at":"2024-05-17T09:37:19.254-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":114662216,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662216/thumbnails/1.jpg","file_name":"pmc4899132.pdf","download_url":"https://www.academia.edu/attachments/114662216/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_temporal_cortical_single_neuron_ac.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662216/pmc4899132-libre.pdf?1715964031=\u0026response-content-disposition=attachment%3B+filename%3DHuman_temporal_cortical_single_neuron_ac.pdf\u0026Expires=1733053720\u0026Signature=cn1fQftdcPvbtwie1TAFvrkTLV4IUpck3eqSw0n~MkiLYfmfbF0KtJpct5QNkpnQ7mNrf8dQbKPKQd7kuj6bB4ekb82NTHqbThNLsE978XIK1FgmbfHMlybHbSp1X6UmadcLV4nsGHjuZ~6Au1Mp8gEl-NA4ywQUShuAL0rzQ-Wgaow16iMOEvgBhhhtgKMYQpHP5opxOLeKXSVFjMsCaBacm1wAWCEtOeEWm8Xas~NgJa2fOL0v1ZFtnVq5rMw7b8OZtLdX41O5HdSSRyDBh~T1Fros1VNriwyZ23IsALiuouC7N0E9N75cH2NiyI5fInDyYcHnIuA4fXa-7dYvYQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Human_temporal_cortical_single_neuron_activity_during_working_memory_maintenance","translated_slug":"","page_count":29,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David 
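The proportions quoted above can be recomputed directly from the reported counts; this snippet is only a sanity check showing how the 45% and 65% figures arise from the per-study numbers.

```python
# Sanity check on the proportions quoted in the abstract.
responsive = 38 + 36   # neurons changing with distractor manipulation (Study One + Study Two)
total = 86 + 80        # neurons recorded across both studies
increased = 26 + 22    # responsive neurons more active when distractors were absent

print(f"{responsive}/{total} = {responsive / total:.0%}")          # ~45%
print(f"{increased}/{responsive} = {increased / responsive:.0%}")  # ~65%
```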
Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":114662216,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662216/thumbnails/1.jpg","file_name":"pmc4899132.pdf","download_url":"https://www.academia.edu/attachments/114662216/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_temporal_cortical_single_neuron_ac.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662216/pmc4899132-libre.pdf?1715964031=\u0026response-content-disposition=attachment%3B+filename%3DHuman_temporal_cortical_single_neuron_ac.pdf\u0026Expires=1733053720\u0026Signature=cn1fQftdcPvbtwie1TAFvrkTLV4IUpck3eqSw0n~MkiLYfmfbF0KtJpct5QNkpnQ7mNrf8dQbKPKQd7kuj6bB4ekb82NTHqbThNLsE978XIK1FgmbfHMlybHbSp1X6UmadcLV4nsGHjuZ~6Au1Mp8gEl-NA4ywQUShuAL0rzQ-Wgaow16iMOEvgBhhhtgKMYQpHP5opxOLeKXSVFjMsCaBacm1wAWCEtOeEWm8Xas~NgJa2fOL0v1ZFtnVq5rMw7b8OZtLdX41O5HdSSRyDBh~T1Fros1VNriwyZ23IsALiuouC7N0E9N75cH2NiyI5fInDyYcHnIuA4fXa-7dYvYQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"},{"id":114662215,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662215/thumbnails/1.jpg","file_name":"pmc4899132.pdf","download_url":"https://www.academia.edu/attachments/114662215/download_file","bulk_download_file_name":"Human_temporal_cortical_single_neuron_ac.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662215/pmc4899132-libre.pdf?1715964031=\u0026response-content-disposition=attachment%3B+filename%3DHuman_temporal_cortical_single_neuron_ac.pdf\u0026Expires=1733053720\u0026Signature=T2WHVX1bhbE4-PDBuZfQ-lcQ8YEgmU-0yaxG9k7l~p9bZuXNBH-nAPKtxHFjEGxk54agIFHooPtzyCzaivaXVASDe87M7T7e1~6xU47lKKjeFnjMScD0JwbiuwgJbvRoZ~nB~2pFGGdZLd61d-VEZYtAgBDxcHNcf5Ln~qzTSw6CP7cNIU4ok98ncOQ~kR8Ngv~~nDG8x71CUm6UxkOfCDOQyqeZYAbcxQrA0xph-1Dt2NlmiNOIzA4eoz4Pw4CEXx0xc60cH-d~OM2hZk6AFc2GDgk0qLjdqK6NHpclhQweoF8ycSqackeDMjyvWjnovFK0wyxzCla9h3YfJI65Yw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":161,"name":"Neuroscience","url":"https://www.academia.edu/Documents/in/Neuroscience"},{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":237,"name":"Cognitive Science","url":"https://www.academia.edu/Documents/in/Cognitive_Science"},{"id":8538,"name":"Working Memory","url":"https://www.academia.edu/Documents/in/Working_Memory"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":57557,"name":"Temporal Lobe","url":"https://www.academia.edu/Documents/in/Temporal_Lobe"},{"id":452621,"name":"Neuropsychologia","url":"https://www.academia.edu/Documents/in/Neuropsychologia"},{"id":1239755,"name":"Neurosciences","url":"https://www.academia.edu/Documents/in/Neurosciences"}],"urls":[{"id":42052901,"url":"https://europepmc.org/articles/pmc4899132?pdf=render"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="119246720"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/119246720/Evaluating_spatial_normalization_methods_for_the_human_brain"><img alt="Research paper thumbnail of Evaluating spatial normalization methods for the human brain" class="work-thumbnail" 
src="https://attachments.academia-assets.com/114662260/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/119246720/Evaluating_spatial_normalization_methods_for_the_human_brain">Evaluating spatial normalization methods for the human brain</a></div><div class="wp-workCard_item"><span>Conference proceedings : ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Annual Conference</span><span>, 2005</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Cortical mapping (CSM) studies have shown cortical locations for language function are highly var...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Cortical mapping (CSM) studies have shown cortical locations for language function are highly variable from one subject to the next. If individual variation can be normalized, patterns of language organization may emerge that were heretofore hidden. In order to uncover these patterns, computer-aided spatial normalization to a common atlas is required. Our goal was to determine a methodology by which spatial normalization methods could be evaluated and compared. We developed key metrics to measure accuracy of a surface-based (Caret) and volume-based (SPM2) method. We specified that the optimal method would i) minimize variation as measured by spread reduction between CSM language sites across subjects while also ii) preserving anatomical localization of all CSM sites. Eleven subject&amp;#39;s structural MR image sets and corresponding CSM site coordinates were registered to the colin27 human brain atlas using each method. 
Local analysis showed that mapping error rates were highest in mor...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="391d6740af3f89be07e8bb574ccbaa89" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:114662260,&quot;asset_id&quot;:119246720,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/114662260/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="119246720"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="119246720"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 119246720; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=119246720]").text(description); $(".js-view-count[data-work-id=119246720]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 119246720; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='119246720']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 119246720, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "391d6740af3f89be07e8bb574ccbaa89" } } $('.js-work-strip[data-work-id=119246720]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":119246720,"title":"Evaluating spatial normalization methods for the human brain","translated_title":"","metadata":{"abstract":"Cortical mapping (CSM) studies have shown cortical locations for language function are highly variable from one subject to the next. If individual variation can be normalized, patterns of language organization may emerge that were heretofore hidden. 
In order to uncover these patterns, computer-aided spatial normalization to a common atlas is required. Our goal was to determine a methodology by which spatial normalization methods could be evaluated and compared. We developed key metrics to measure accuracy of a surface-based (Caret) and volume-based (SPM2) method. We specified that the optimal method would i) minimize variation as measured by spread reduction between CSM language sites across subjects while also ii) preserving anatomical localization of all CSM sites. Eleven subject\u0026#39;s structural MR image sets and corresponding CSM site coordinates were registered to the colin27 human brain atlas using each method. Local analysis showed that mapping error rates were highest in mor...","publication_date":{"day":null,"month":null,"year":2005,"errors":{}},"publication_name":"Conference proceedings : ... Annual International Conference of the IEEE Engineering in Medicine and Biology Society. IEEE Engineering in Medicine and Biology Society. Annual Conference"},"translated_abstract":"Cortical mapping (CSM) studies have shown cortical locations for language function are highly variable from one subject to the next. If individual variation can be normalized, patterns of language organization may emerge that were heretofore hidden. In order to uncover these patterns, computer-aided spatial normalization to a common atlas is required. Our goal was to determine a methodology by which spatial normalization methods could be evaluated and compared. We developed key metrics to measure accuracy of a surface-based (Caret) and volume-based (SPM2) method. We specified that the optimal method would i) minimize variation as measured by spread reduction between CSM language sites across subjects while also ii) preserving anatomical localization of all CSM sites. Eleven subject\u0026#39;s structural MR image sets and corresponding CSM site coordinates were registered to the colin27 human brain atlas using each method. 
Local analysis showed that mapping error rates were highest in mor...","internal_url":"https://www.academia.edu/119246720/Evaluating_spatial_normalization_methods_for_the_human_brain","translated_internal_url":"","created_at":"2024-05-17T09:37:18.985-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":114662260,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662260/thumbnails/1.jpg","file_name":"download.pdf","download_url":"https://www.academia.edu/attachments/114662260/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Evaluating_spatial_normalization_methods.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662260/download-libre.pdf?1715964060=\u0026response-content-disposition=attachment%3B+filename%3DEvaluating_spatial_normalization_methods.pdf\u0026Expires=1733053720\u0026Signature=Bb3aCMqitiRCUW004ft8jEgfCs5Y8pEDW5jJ9jWAIXnYnItiIsVhTaqJHOCqR6TtTl5F-iBahr9Zo6BZInUGTFRo~3rL~tGXXLDehbL1oig5Q16AE1zGCID2bydEPe-FQ9UtZ0zvex7q3Aon~VS1kO0c2eYN9ZkLbUjdwHyy2uGNiOkPKLxtsz0V68go~YfssCMlPGVZKuF4Nd4bBL8A71Ty2HJqErYW6gi-Dd7B1y7qDFUvTYouRSRbf1t9SHM4ea3mqpoMSPtr2n3WUXEwxpej7m1cPlXk1OSCkF8txWvoq6KVQ71QjPCq9nMzFTuWjz~XFdOk-0IJKb~DrJYmxQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Evaluating_spatial_normalization_methods_for_the_human_brain","translated_slug":"","page_count":133,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":114662260,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662260/thumbnails/1.jpg","file_name":"download.pdf","download_url":"https://www.academia.edu/attachments/114662260/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Evaluating_spatial_normalization_methods.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662260/download-libre.pdf?1715964060=\u0026response-content-disposition=attachment%3B+filename%3DEvaluating_spatial_normalization_methods.pdf\u0026Expires=1733053720\u0026Signature=Bb3aCMqitiRCUW004ft8jEgfCs5Y8pEDW5jJ9jWAIXnYnItiIsVhTaqJHOCqR6TtTl5F-iBahr9Zo6BZInUGTFRo~3rL~tGXXLDehbL1oig5Q16AE1zGCID2bydEPe-FQ9UtZ0zvex7q3Aon~VS1kO0c2eYN9ZkLbUjdwHyy2uGNiOkPKLxtsz0V68go~YfssCMlPGVZKuF4Nd4bBL8A71Ty2HJqErYW6gi-Dd7B1y7qDFUvTYouRSRbf1t9SHM4ea3mqpoMSPtr2n3WUXEwxpej7m1cPlXk1OSCkF8txWvoq6KVQ71QjPCq9nMzFTuWjz~XFdOk-0IJKb~DrJYmxQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":465,"name":"Artificial Intelligence","url":"https://www.academia.edu/Documents/in/Artificial_Intelligence"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":29731,"name":"Image Registration","url":"https://www.academia.edu/Documents/in/Image_Registration"},{"id":54589,"name":"Anatomy","url":"https://www.academia.edu/Documents/in/Anatomy"},{"id":125564,"name":"Statistical 
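The spread-reduction criterion in (i) can be made concrete with a small calculation. The following sketch assumes spread is quantified as the root-mean-square distance of each subject's mapped site from the across-subject centroid; the coordinate arrays are hypothetical, and this is an illustration rather than the metric actually used in the paper.

    import numpy as np

    def rms_spread(coords):
        # Root-mean-square distance (mm) of each subject's site from the group centroid
        centroid = coords.mean(axis=0)
        return np.sqrt(((coords - centroid) ** 2).sum(axis=1).mean())

    # Hypothetical CSM language-site coordinates, one row per subject, in atlas space (mm)
    before_norm = np.array([[52.0, 14.0, 20.0],
                            [60.0,  8.0, 28.0],
                            [48.0, 20.0, 12.0],
                            [58.0, 10.0, 24.0]])
    after_norm = np.array([[55.0, 12.0, 22.0],
                           [57.0, 11.0, 24.0],
                           [54.0, 14.0, 21.0],
                           [56.0, 12.0, 23.0]])

    spread_before, spread_after = rms_spread(before_norm), rms_spread(after_norm)
    print(f"spread: {spread_before:.1f} mm -> {spread_after:.1f} mm "
          f"({100 * (1 - spread_after / spread_before):.0f}% reduction)")

A smaller spread after registration, without sites drifting out of their expected gyral locations, is the kind of outcome criterion (i) and (ii) describe.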
Significance","url":"https://www.academia.edu/Documents/in/Statistical_Significance"},{"id":137633,"name":"Feedback","url":"https://www.academia.edu/Documents/in/Feedback"},{"id":152918,"name":"Error Analysis","url":"https://www.academia.edu/Documents/in/Error_Analysis"},{"id":164637,"name":"Bit Error Rate","url":"https://www.academia.edu/Documents/in/Bit_Error_Rate"},{"id":179931,"name":"Conference Proceedings","url":"https://www.academia.edu/Documents/in/Conference_Proceedings"},{"id":198377,"name":"Individual variation","url":"https://www.academia.edu/Documents/in/Individual_variation"},{"id":203010,"name":"Human Brain","url":"https://www.academia.edu/Documents/in/Human_Brain"},{"id":1122411,"name":"Mr Imaging","url":"https://www.academia.edu/Documents/in/Mr_Imaging"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="119246713"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/119246713/A_novel_EEG_paradigm_to_simultaneously_and_rapidly_assess_the_functioning_of_auditory_and_visual_pathways"><img alt="Research paper thumbnail of A novel EEG paradigm to simultaneously and rapidly assess the functioning of auditory and visual pathways" class="work-thumbnail" src="https://attachments.academia-assets.com/114662242/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/119246713/A_novel_EEG_paradigm_to_simultaneously_and_rapidly_assess_the_functioning_of_auditory_and_visual_pathways">A novel EEG paradigm to simultaneously and rapidly assess the functioning of auditory and visual pathways</a></div><div class="wp-workCard_item"><span>Journal of Neurophysiology</span><span>, 2019</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Objective assessment of the sensory pathways is crucial for understanding their development acros...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Objective assessment of the sensory pathways is crucial for understanding their development across the life span and how they may be affected by neurodevelopmental disorders (e.g., autism spectrum) and neurological pathologies (e.g., stroke, multiple sclerosis, etc.). Quick and passive measurements, for example, using electroencephalography (EEG), are especially important when working with infants and young children and with patient populations having communication deficits (e.g., aphasia). However, many EEG paradigms are limited to measuring activity from one sensory domain at a time, may be time consuming, and target only a subset of possible responses from that particular sensory domain (e.g., only auditory brainstem responses or only auditory P1-N1-P2 evoked potentials). 
Thus we developed a new multisensory paradigm that enables simultaneous, robust, and rapid (6–12 min) measurements of both auditory and visual EEG activity, including auditory brainstem responses, auditory and v...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="89f3ad46808fae12abd7e500df343f77" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:114662242,&quot;asset_id&quot;:119246713,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/114662242/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="119246713"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="119246713"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 119246713; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=119246713]").text(description); $(".js-view-count[data-work-id=119246713]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 119246713; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='119246713']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 119246713, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "89f3ad46808fae12abd7e500df343f77" } } $('.js-work-strip[data-work-id=119246713]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":119246713,"title":"A novel EEG paradigm to simultaneously and rapidly assess the functioning of auditory and visual pathways","translated_title":"","metadata":{"abstract":"Objective assessment of the sensory pathways is crucial for understanding their development across the life span and how they 
may be affected by neurodevelopmental disorders (e.g., autism spectrum) and neurological pathologies (e.g., stroke, multiple sclerosis, etc.). Quick and passive measurements, for example, using electroencephalography (EEG), are especially important when working with infants and young children and with patient populations having communication deficits (e.g., aphasia). However, many EEG paradigms are limited to measuring activity from one sensory domain at a time, may be time consuming, and target only a subset of possible responses from that particular sensory domain (e.g., only auditory brainstem responses or only auditory P1-N1-P2 evoked potentials). Thus we developed a new multisensory paradigm that enables simultaneous, robust, and rapid (6–12 min) measurements of both auditory and visual EEG activity, including auditory brainstem responses, auditory and v...","publisher":"American Physiological Society","publication_date":{"day":null,"month":null,"year":2019,"errors":{}},"publication_name":"Journal of Neurophysiology"},"translated_abstract":"Objective assessment of the sensory pathways is crucial for understanding their development across the life span and how they may be affected by neurodevelopmental disorders (e.g., autism spectrum) and neurological pathologies (e.g., stroke, multiple sclerosis, etc.). Quick and passive measurements, for example, using electroencephalography (EEG), are especially important when working with infants and young children and with patient populations having communication deficits (e.g., aphasia). However, many EEG paradigms are limited to measuring activity from one sensory domain at a time, may be time consuming, and target only a subset of possible responses from that particular sensory domain (e.g., only auditory brainstem responses or only auditory P1-N1-P2 evoked potentials). 
Thus we developed a new multisensory paradigm that enables simultaneous, robust, and rapid (6–12 min) measurements of both auditory and visual EEG activity, including auditory brainstem responses, auditory and v...","internal_url":"https://www.academia.edu/119246713/A_novel_EEG_paradigm_to_simultaneously_and_rapidly_assess_the_functioning_of_auditory_and_visual_pathways","translated_internal_url":"","created_at":"2024-05-17T09:36:59.676-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":114662242,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662242/thumbnails/1.jpg","file_name":"231868537.pdf","download_url":"https://www.academia.edu/attachments/114662242/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"A_novel_EEG_paradigm_to_simultaneously_a.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662242/231868537-libre.pdf?1715964183=\u0026response-content-disposition=attachment%3B+filename%3DA_novel_EEG_paradigm_to_simultaneously_a.pdf\u0026Expires=1733053720\u0026Signature=VnMzHOnQSI92D9fS5V3p4BGusvw5BDyd0OR5L~P0JU26AO-EI0nSq8-GIU3iKZtSg4r4skNvOUPVRgrNScH8Zx7IoC9wBrhgzhGXdbOGu1nGIqoek00DSIFvEjQMV6Os~na7I8-uWvS1S0jHkC94t9x04Vhgqqx8zfL9oYZeVTLaF8sZwbp2L5X4HjMd5vpxKrO1Kg-hPftflqbNe8Y4izHLoKcWISzf-X01jfTm~MFVKYAzhzcytd3~Sc8zrt1CfNyh6Po4APEh93KjSN-WMrFUlvgM7yCm8iXQZcqNjSiuwEzg9ToTTMZ7WJiXLvzswu8U3lqhkZYxg98iAkL2qw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"A_novel_EEG_paradigm_to_simultaneously_and_rapidly_assess_the_functioning_of_auditory_and_visual_pathways","translated_slug":"","page_count":60,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":114662242,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/114662242/thumbnails/1.jpg","file_name":"231868537.pdf","download_url":"https://www.academia.edu/attachments/114662242/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"A_novel_EEG_paradigm_to_simultaneously_a.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/114662242/231868537-libre.pdf?1715964183=\u0026response-content-disposition=attachment%3B+filename%3DA_novel_EEG_paradigm_to_simultaneously_a.pdf\u0026Expires=1733053720\u0026Signature=VnMzHOnQSI92D9fS5V3p4BGusvw5BDyd0OR5L~P0JU26AO-EI0nSq8-GIU3iKZtSg4r4skNvOUPVRgrNScH8Zx7IoC9wBrhgzhGXdbOGu1nGIqoek00DSIFvEjQMV6Os~na7I8-uWvS1S0jHkC94t9x04Vhgqqx8zfL9oYZeVTLaF8sZwbp2L5X4HjMd5vpxKrO1Kg-hPftflqbNe8Y4izHLoKcWISzf-X01jfTm~MFVKYAzhzcytd3~Sc8zrt1CfNyh6Po4APEh93KjSN-WMrFUlvgM7yCm8iXQZcqNjSiuwEzg9ToTTMZ7WJiXLvzswu8U3lqhkZYxg98iAkL2qw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":161,"name":"Neuroscience","url":"https://www.academia.edu/Documents/in/Neuroscience"},{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":422,"name":"Computer 
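For readers unfamiliar with the evoked potentials mentioned above (e.g., the auditory P1-N1-P2 complex), these are conventionally recovered by averaging many stimulus-locked EEG epochs so that activity unrelated to the stimulus averages out. The sketch below illustrates only that generic averaging step on synthetic data; the epoch array, sampling rate, and simulated deflection are hypothetical and are not taken from the paradigm described in the paper.

    import numpy as np

    fs = 1000                        # hypothetical sampling rate (Hz)
    n_trials, n_samples = 200, 600   # 600 ms epochs: 100 ms pre-stimulus, 500 ms post
    rng = np.random.default_rng(0)

    t = np.arange(n_samples) / fs - 0.1                    # time axis (s); 0 = stimulus onset
    epochs = rng.normal(0.0, 10.0, (n_trials, n_samples))  # background EEG (microvolts)
    epochs += 5.0 * np.exp(-((t - 0.1) / 0.02) ** 2)       # toy evoked deflection near 100 ms

    # Baseline-correct each trial against its pre-stimulus interval, then average across trials
    baseline = epochs[:, t < 0].mean(axis=1, keepdims=True)
    erp = (epochs - baseline).mean(axis=0)

    post = t > 0
    peak_ms = 1000 * t[post][np.argmax(np.abs(erp[post]))]
    print(f"largest post-stimulus deflection at {peak_ms:.0f} ms")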
Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":10904,"name":"Electroencephalography","url":"https://www.academia.edu/Documents/in/Electroencephalography"},{"id":22272,"name":"Neurophysiology","url":"https://www.academia.edu/Documents/in/Neurophysiology"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":2922956,"name":"Psychology and Cognitive Sciences","url":"https://www.academia.edu/Documents/in/Psychology_and_Cognitive_Sciences"},{"id":3763225,"name":"Medical and Health Sciences","url":"https://www.academia.edu/Documents/in/Medical_and_Health_Sciences"}],"urls":[{"id":42052897,"url":"https://www.physiology.org/doi/pdf/10.1152/jn.00868.2018"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554085"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554085/in_American_Sign_Language"><img alt="Research paper thumbnail of in American Sign Language?" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554085/in_American_Sign_Language">in American Sign Language?</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Perceptual invariance or orientation specificity</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554085"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554085"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554085; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554085]").text(description); $(".js-view-count[data-work-id=77554085]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554085; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554085']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554085, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> 
require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=77554085]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554085,"title":"in American Sign Language?","translated_title":"","metadata":{"abstract":"Perceptual invariance or orientation specificity","publication_date":{"day":null,"month":null,"year":2015,"errors":{}}},"translated_abstract":"Perceptual invariance or orientation specificity","internal_url":"https://www.academia.edu/77554085/in_American_Sign_Language","translated_internal_url":"","created_at":"2022-04-25T04:29:43.425-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"in_American_Sign_Language","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[],"research_interests":[{"id":319041,"name":"Repetition Priming","url":"https://www.academia.edu/Documents/in/Repetition_Priming"}],"urls":[{"id":19887687,"url":"http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.602.5112\u0026rep=rep1\u0026type=pdf"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554084"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554084/Characterization_Of_Visual_Properties_Of_Spatial_Frequency_And_Speed_in"><img alt="Research paper thumbnail of Characterization Of Visual Properties Of Spatial Frequency And Speed in" class="work-thumbnail" src="https://attachments.academia-assets.com/84887712/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554084/Characterization_Of_Visual_Properties_Of_Spatial_Frequency_And_Speed_in">Characterization Of Visual Properties Of Spatial Frequency And Speed in</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Careful measurements of the dynamics of speech production have provided important insights into p...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Careful measurements of the 
dynamics of speech production have provided important insights into phonetic properties of spoken languages. By contrast, analytic quantification of the visual properties of signed languages remains largely unexplored. The purpose of this study was to characterize the spatial and temporal visual properties of American Sign Language (ASL). Novel measurement techniques were used to analyze the spatial frequency of signs and the speed of the hands as they move through space. In Study 1, the amount of energy (or &amp;quot;contrast&amp;quot;) as a function of spatial frequency was determined for various sign categories by applying a Fourier transform to static photographs of twoASL signers. In order to determine whether signing produces unique spatial frequency information, amplitude spectra of a person signing were compared to those of a &amp;quot;neutral&amp;quot; image of a person at rest (not signing). The results of this study reveal only small differences in the amplitu...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="bb5f98517fb67ff226fa08d06f3eb288" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84887712,&quot;asset_id&quot;:77554084,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84887712/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554084"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554084"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554084; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554084]").text(description); $(".js-view-count[data-work-id=77554084]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554084; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554084']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554084, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 
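The Fourier-based measurement in Study 1 can be approximated with standard image-processing tools: compute the 2D Fourier transform of a grayscale still, take its amplitude spectrum, and average amplitude over rings of equal radial frequency to obtain energy as a function of spatial frequency. The sketch below is a generic illustration of that procedure, not the authors' analysis code, and the image file names are placeholders.

    import numpy as np
    from PIL import Image

    def radial_amplitude_spectrum(path):
        # Mean FFT amplitude as a function of radial spatial frequency (cycles/image)
        img = np.asarray(Image.open(path).convert("L"), dtype=float)
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img)))
        h, w = img.shape
        y, x = np.indices((h, w))
        r = np.hypot(y - h / 2, x - w / 2).astype(int)   # integer radius bin for each pixel
        sums = np.bincount(r.ravel(), weights=spectrum.ravel())
        counts = np.maximum(np.bincount(r.ravel()), 1)
        return sums / counts

    # Compare a frame of a person signing with a "neutral" at-rest frame (placeholder file names)
    signing = radial_amplitude_spectrum("signing_frame.png")
    neutral = radial_amplitude_spectrum("neutral_frame.png")
    print("low spatial-frequency amplitude (first 20 bins), signing vs neutral:",
          signing[:20].mean(), neutral[:20].mean())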
})(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "bb5f98517fb67ff226fa08d06f3eb288" } } $('.js-work-strip[data-work-id=77554084]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554084,"title":"Characterization Of Visual Properties Of Spatial Frequency And Speed in","translated_title":"","metadata":{"abstract":"Careful measurements of the dynamics of speech production have provided important insights into phonetic properties of spoken languages. By contrast, analytic quantification of the visual properties of signed languages remains largely unexplored. The purpose of this study was to characterize the spatial and temporal visual properties of American Sign Language (ASL). Novel measurement techniques were used to analyze the spatial frequency of signs and the speed of the hands as they move through space. In Study 1, the amount of energy (or \u0026quot;contrast\u0026quot;) as a function of spatial frequency was determined for various sign categories by applying a Fourier transform to static photographs of twoASL signers. In order to determine whether signing produces unique spatial frequency information, amplitude spectra of a person signing were compared to those of a \u0026quot;neutral\u0026quot; image of a person at rest (not signing). The results of this study reveal only small differences in the amplitu...","publication_date":{"day":null,"month":null,"year":2003,"errors":{}}},"translated_abstract":"Careful measurements of the dynamics of speech production have provided important insights into phonetic properties of spoken languages. By contrast, analytic quantification of the visual properties of signed languages remains largely unexplored. The purpose of this study was to characterize the spatial and temporal visual properties of American Sign Language (ASL). Novel measurement techniques were used to analyze the spatial frequency of signs and the speed of the hands as they move through space. In Study 1, the amount of energy (or \u0026quot;contrast\u0026quot;) as a function of spatial frequency was determined for various sign categories by applying a Fourier transform to static photographs of twoASL signers. In order to determine whether signing produces unique spatial frequency information, amplitude spectra of a person signing were compared to those of a \u0026quot;neutral\u0026quot; image of a person at rest (not signing). 
The results of this study reveal only small differences in the amplitu...","internal_url":"https://www.academia.edu/77554084/Characterization_Of_Visual_Properties_Of_Spatial_Frequency_And_Speed_in","translated_internal_url":"","created_at":"2022-04-25T04:29:43.293-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84887712,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887712/thumbnails/1.jpg","file_name":"BosworthManuscript.pdf","download_url":"https://www.academia.edu/attachments/84887712/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Characterization_Of_Visual_Properties_Of.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887712/BosworthManuscript-libre.pdf?1650938759=\u0026response-content-disposition=attachment%3B+filename%3DCharacterization_Of_Visual_Properties_Of.pdf\u0026Expires=1733053720\u0026Signature=PbC~WefGAVBMWIU2wvDI8yBPqOIt0pYCx-7LF8CWQoiwCKjW1~l6bmRO83EVGeUCqYrwMxRBassdtDkwEf7IFuFb~lZMbGz6yKP8PSrNrC0ggAut9libYOW0a2DS3-cguxDTj64Dbqthld0c~53OG0lxoYRGbxuRkYimshVJSPUOTn3-CJufaYCGbGGQlCGsufmPZgY4CYmJhcZaALhXIC0cBw0vw9X11i8MjJbK0HjeVL03U5pNdBFS8ImbhfZpxktddQws-f1YSf7Jbb9kj76SABEgPKrxvbcgQQwMbdj8bd1NDv2ZeSaHtOSuad7cOO1lAjttd2Nx~sUJXFcb0A__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Characterization_Of_Visual_Properties_Of_Spatial_Frequency_And_Speed_in","translated_slug":"","page_count":17,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84887712,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887712/thumbnails/1.jpg","file_name":"BosworthManuscript.pdf","download_url":"https://www.academia.edu/attachments/84887712/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Characterization_Of_Visual_Properties_Of.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887712/BosworthManuscript-libre.pdf?1650938759=\u0026response-content-disposition=attachment%3B+filename%3DCharacterization_Of_Visual_Properties_Of.pdf\u0026Expires=1733053720\u0026Signature=PbC~WefGAVBMWIU2wvDI8yBPqOIt0pYCx-7LF8CWQoiwCKjW1~l6bmRO83EVGeUCqYrwMxRBassdtDkwEf7IFuFb~lZMbGz6yKP8PSrNrC0ggAut9libYOW0a2DS3-cguxDTj64Dbqthld0c~53OG0lxoYRGbxuRkYimshVJSPUOTn3-CJufaYCGbGGQlCGsufmPZgY4CYmJhcZaALhXIC0cBw0vw9X11i8MjJbK0HjeVL03U5pNdBFS8ImbhfZpxktddQws-f1YSf7Jbb9kj76SABEgPKrxvbcgQQwMbdj8bd1NDv2ZeSaHtOSuad7cOO1lAjttd2Nx~sUJXFcb0A__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":237,"name":"Cognitive Science","url":"https://www.academia.edu/Documents/in/Cognitive_Science"},{"id":1214,"name":"Sign Language","url":"https://www.academia.edu/Documents/in/Sign_Language"},{"id":8054,"name":"Speech Production","url":"https://www.academia.edu/Documents/in/Speech_Production"},{"id":9781,"name":"American Sign Language","url":"https://www.academia.edu/Documents/in/American_Sign_Language"},{"id":155840,"name":"Spatial 
Frequency","url":"https://www.academia.edu/Documents/in/Spatial_Frequency"},{"id":267802,"name":"Dimensional","url":"https://www.academia.edu/Documents/in/Dimensional"},{"id":390056,"name":"Fourier transform","url":"https://www.academia.edu/Documents/in/Fourier_transform"},{"id":3007616,"name":"Measurement technique","url":"https://www.academia.edu/Documents/in/Measurement_technique"}],"urls":[{"id":19887686,"url":"http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.5.4932\u0026rep=rep1\u0026type=pdf"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554083"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554083/Full_title_Perceptual_invariance_or_orientation_specificity_in_American_Sign_Language_Evidence_from_repetition_priming_for_signs_and_gestures_Short_title_ASL_repetition_priming"><img alt="Research paper thumbnail of Full title : Perceptual invariance or orientation specificity in American Sign Language ? Evidence from repetition priming for signs and gestures Short title : ASL repetition priming" class="work-thumbnail" src="https://attachments.academia-assets.com/84847728/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554083/Full_title_Perceptual_invariance_or_orientation_specificity_in_American_Sign_Language_Evidence_from_repetition_priming_for_signs_and_gestures_Short_title_ASL_repetition_priming">Full title : Perceptual invariance or orientation specificity in American Sign Language ? Evidence from repetition priming for signs and gestures Short title : ASL repetition priming</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Repetition priming has been successfully employed to examine stages of processing in a wide varie...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Repetition priming has been successfully employed to examine stages of processing in a wide variety of cognitive domains including language, object recognition and memory. This study uses a novel repetition priming paradigm in the context of a categorization task to explore early stages in the processing of American Sign Language signs and self-grooming gestures. Specifically, we investigated the degree to which deaf signers&amp;#39; and hearing non-signers&amp;#39; perception of these linguistic or non-linguistic actions might be differentially robust to changes in perceptual viewpoint. We conjectured that to the extent that signers were accessing language-specific representations in their performance of the task, they might show more similar priming effects under different viewing conditions than hearing subjects. In essence, this would provide evidence for a visually-based &amp;quot; lack of invariance &amp;quot; phenomenon. 
However, if the early stages of visual action processing are similar fo...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="87f339653d0cb53bb97eedd3097a2642" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84847728,&quot;asset_id&quot;:77554083,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84847728/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554083"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554083"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554083; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554083]").text(description); $(".js-view-count[data-work-id=77554083]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554083; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554083']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554083, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "87f339653d0cb53bb97eedd3097a2642" } } $('.js-work-strip[data-work-id=77554083]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554083,"title":"Full title : Perceptual invariance or orientation specificity in American Sign Language ? Evidence from repetition priming for signs and gestures Short title : ASL repetition priming","translated_title":"","metadata":{"abstract":"Repetition priming has been successfully employed to examine stages of processing in a wide variety of cognitive domains including language, object recognition and memory. 
This study uses a novel repetition priming paradigm in the context of a categorization task to explore early stages in the processing of American Sign Language signs and self-grooming gestures. Specifically, we investigated the degree to which deaf signers\u0026#39; and hearing non-signers\u0026#39; perception of these linguistic or non-linguistic actions might be differentially robust to changes in perceptual viewpoint. We conjectured that to the extent that signers were accessing language-specific representations in their performance of the task, they might show more similar priming effects under different viewing conditions than hearing subjects. In essence, this would provide evidence for a visually-based \u0026quot; lack of invariance \u0026quot; phenomenon. However, if the early stages of visual action processing are similar fo...","ai_title_tag":"Perceptual Invariance in ASL: Evidence from Repetition Priming","publication_date":{"day":null,"month":null,"year":2010,"errors":{}}},"translated_abstract":"Repetition priming has been successfully employed to examine stages of processing in a wide variety of cognitive domains including language, object recognition and memory. This study uses a novel repetition priming paradigm in the context of a categorization task to explore early stages in the processing of American Sign Language signs and self-grooming gestures. Specifically, we investigated the degree to which deaf signers\u0026#39; and hearing non-signers\u0026#39; perception of these linguistic or non-linguistic actions might be differentially robust to changes in perceptual viewpoint. We conjectured that to the extent that signers were accessing language-specific representations in their performance of the task, they might show more similar priming effects under different viewing conditions than hearing subjects. In essence, this would provide evidence for a visually-based \u0026quot; lack of invariance \u0026quot; phenomenon. 
However, if the early stages of visual action processing are similar fo...","internal_url":"https://www.academia.edu/77554083/Full_title_Perceptual_invariance_or_orientation_specificity_in_American_Sign_Language_Evidence_from_repetition_priming_for_signs_and_gestures_Short_title_ASL_repetition_priming","translated_internal_url":"","created_at":"2022-04-25T04:29:43.165-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84847728,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84847728/thumbnails/1.jpg","file_name":"Corina_20FLR_20Paper_20August_202010.pdf","download_url":"https://www.academia.edu/attachments/84847728/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Full_title_Perceptual_invariance_or_orie.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84847728/Corina_20FLR_20Paper_20August_202010-libre.pdf?1650886665=\u0026response-content-disposition=attachment%3B+filename%3DFull_title_Perceptual_invariance_or_orie.pdf\u0026Expires=1733053720\u0026Signature=TKIiFIP5tLDxTF8U-qHGwWnhQ5kg3kFts~DbB7G8aX0NZX6HyMYNcbUJ-rVDfFAI28C9n95PnKuTvVMElGxZoRh8iGNyuYapTrYQokiXAAITw-TZGEpaaF0lhcBf962aKfdSs~zUxtfO03RJn~mXc4b8iv6bG55yA~oejXe1bSANYx4M6lHXh8J~H~mCbLy-w4f-Rt2J1v2LPs11nZdxp57Zr29BhLEXBcQiuGz9qXTZun6O~JwkfLJhP3yPlNuM00dUXOrZGmJWKgT~Mphr7Sve9pGUKjqEi0UKDQOx7RSbuNS-Hbgn84OyLkFMJQOPHrS18C6kr9PMWFHbRzsppg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Full_title_Perceptual_invariance_or_orientation_specificity_in_American_Sign_Language_Evidence_from_repetition_priming_for_signs_and_gestures_Short_title_ASL_repetition_priming","translated_slug":"","page_count":36,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David 
Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84847728,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84847728/thumbnails/1.jpg","file_name":"Corina_20FLR_20Paper_20August_202010.pdf","download_url":"https://www.academia.edu/attachments/84847728/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMCw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Full_title_Perceptual_invariance_or_orie.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84847728/Corina_20FLR_20Paper_20August_202010-libre.pdf?1650886665=\u0026response-content-disposition=attachment%3B+filename%3DFull_title_Perceptual_invariance_or_orie.pdf\u0026Expires=1733053720\u0026Signature=TKIiFIP5tLDxTF8U-qHGwWnhQ5kg3kFts~DbB7G8aX0NZX6HyMYNcbUJ-rVDfFAI28C9n95PnKuTvVMElGxZoRh8iGNyuYapTrYQokiXAAITw-TZGEpaaF0lhcBf962aKfdSs~zUxtfO03RJn~mXc4b8iv6bG55yA~oejXe1bSANYx4M6lHXh8J~H~mCbLy-w4f-Rt2J1v2LPs11nZdxp57Zr29BhLEXBcQiuGz9qXTZun6O~JwkfLJhP3yPlNuM00dUXOrZGmJWKgT~Mphr7Sve9pGUKjqEi0UKDQOx7RSbuNS-Hbgn84OyLkFMJQOPHrS18C6kr9PMWFHbRzsppg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"},{"id":84847729,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84847729/thumbnails/1.jpg","file_name":"Corina_20FLR_20Paper_20August_202010.pdf","download_url":"https://www.academia.edu/attachments/84847729/download_file","bulk_download_file_name":"Full_title_Perceptual_invariance_or_orie.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84847729/Corina_20FLR_20Paper_20August_202010-libre.pdf?1650886665=\u0026response-content-disposition=attachment%3B+filename%3DFull_title_Perceptual_invariance_or_orie.pdf\u0026Expires=1733053720\u0026Signature=CsRC1GJyneuC6EnLw6DkIxQUSrYbKefVEpGcdVZ4-viXE8eMmzl2bKrxHWJpUgU-cbI17tAJxOfr0c1yUZnW7efOKWk0CRYQ-wPRQTDL10fqSLET9TV8rvJyRV4jCC-DkeXOWB7tfgInEJr6IOYD8kZPReKy5Ii0Htf9tLeK1Oy0I1wlrMJXYZneeDjKWy4RSBtu7eyXl9e9dXbxvpdhaeM-dLvhffkEQyYOiLXVQ2IicUQwJxHChFAau5NtY1OJwjRmn5Xh0clzEgDe9QqAbe7YtfjfcRuXzEmieXvfvP9tSfPs3ofd5X4auUxoPNClQOnHmev3aBm0Ij80tlAftg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[],"urls":[{"id":19887685,"url":"http://lcn.salk.edu/publications/2009%202010/Corina%20FLR%20Paper%20August%202010.pdf"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554082"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554082/Differential_Processing_of_Topographic_and_Referential_Functions_of_Space"><img alt="Research paper thumbnail of Differential Processing of Topographic and Referential Functions of Space" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554082/Differential_Processing_of_Topographic_and_Referential_Functions_of_Space">Differential Processing of Topographic and Referential Functions of Space</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">... 
involving 37 deaf adult signers and 1 hearing adult signer with a predominantly mesial superi...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">... involving 37 deaf adult signers and 1 hearing adult signer with a predominantly mesial superior occipital-parietal lesion] that investigate: (1) the neural underpinnings of topographic and referential spatial functions, (2) on-line processing of these different uses of space, and (3 ...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554082"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554082"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554082; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554082]").text(description); $(".js-view-count[data-work-id=77554082]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554082; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554082']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554082, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=77554082]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554082,"title":"Differential Processing of Topographic and Referential Functions of Space","translated_title":"","metadata":{"abstract":"... 
involving 37 deaf adult signers and 1 hearing adult signer with a predominantly mesial superior occipital-parietal lesion] that investigate: (1) the neural underpinnings of topographic and referential spatial functions, (2) on-line processing of these different uses of space, and (3 ...","publication_date":{"day":null,"month":null,"year":2013,"errors":{}}},"translated_abstract":"... involving 37 deaf adult signers and 1 hearing adult signer with a predominantly mesial superior occipital-parietal lesion] that investigate: (1) the neural underpinnings of topographic and referential spatial functions, (2) on-line processing of these different uses of space, and (3 ...","internal_url":"https://www.academia.edu/77554082/Differential_Processing_of_Topographic_and_Referential_Functions_of_Space","translated_internal_url":"","created_at":"2022-04-25T04:29:43.059-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Differential_Processing_of_Topographic_and_Referential_Functions_of_Space","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[],"research_interests":[{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554081"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554081/Procesamiento_neural_de_un_lenguaje_silbado"><img alt="Research paper thumbnail of Procesamiento neural de un lenguaje silbado" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554081/Procesamiento_neural_de_un_lenguaje_silbado">Procesamiento neural de un lenguaje silbado</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554081"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554081"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554081; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); 
$(".js-view-count[data-work-id=77554081]").text(description); $(".js-view-count[data-work-id=77554081]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554081; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554081']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554081, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=77554081]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554081,"title":"Procesamiento neural de un lenguaje silbado","translated_title":"","metadata":{"publication_date":{"day":null,"month":null,"year":2005,"errors":{}}},"translated_abstract":null,"internal_url":"https://www.academia.edu/77554081/Procesamiento_neural_de_un_lenguaje_silbado","translated_internal_url":"","created_at":"2022-04-25T04:29:42.968-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Procesamiento_neural_de_un_lenguaje_silbado","translated_slug":"","page_count":null,"language":"es","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2","email":"akk2STJxV015dnVhOGxvdmZydzY2d1NhSE5MRis4NlpualJjTmZlNEQyOD0tLXAzVnV1eFl0VkxaVVZDRk0vQnJCWlE9PQ==--391f32be9bca30489f63a7cdd92ac71a49be911d"},"attachments":[],"research_interests":[],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554080"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554080/Visual_Attention_to_the_Periphery_Is_Enhanced_in_Congenitally_Deaf_Individuals"><img alt="Research paper thumbnail of Visual Attention to the Periphery Is Enhanced in Congenitally Deaf Individuals" class="work-thumbnail" src="https://attachments.academia-assets.com/84887760/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a 
class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554080/Visual_Attention_to_the_Periphery_Is_Enhanced_in_Congenitally_Deaf_Individuals">Visual Attention to the Periphery Is Enhanced in Congenitally Deaf Individuals</a></div><div class="wp-workCard_item"><span>The Journal of Neuroscience</span><span>, 2000</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="5b7c5858843660adeef687ccfb1fd6b4" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84887760,&quot;asset_id&quot;:77554080,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84887760/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554080"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554080"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554080; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554080]").text(description); $(".js-view-count[data-work-id=77554080]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554080; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554080']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554080, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "5b7c5858843660adeef687ccfb1fd6b4" } } $('.js-work-strip[data-work-id=77554080]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554080,"title":"Visual Attention to the Periphery Is Enhanced in Congenitally Deaf 
Individuals","translated_title":"","metadata":{"publisher":"Society for Neuroscience","publication_date":{"day":null,"month":null,"year":2000,"errors":{}},"publication_name":"The Journal of Neuroscience"},"translated_abstract":null,"internal_url":"https://www.academia.edu/77554080/Visual_Attention_to_the_Periphery_Is_Enhanced_in_Congenitally_Deaf_Individuals","translated_internal_url":"","created_at":"2022-04-25T04:29:42.832-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84887760,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887760/thumbnails/1.jpg","file_name":"RC93.full.pdf","download_url":"https://www.academia.edu/attachments/84887760/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Visual_Attention_to_the_Periphery_Is_Enh.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887760/RC93.full-libre.pdf?1650938757=\u0026response-content-disposition=attachment%3B+filename%3DVisual_Attention_to_the_Periphery_Is_Enh.pdf\u0026Expires=1733053721\u0026Signature=XfEUr6WsiZXSR-AW8GxUdoA2~Rgv3MlSGQQjc7cGunecFFvAws7Zjw6iq5lu1RqHAfmy8ri0VRD-75gW2XGIdaqO5rcaw7QKr4d~KibndHy4gk0KVt-Z12SdZPxuW8msTTaF28HhzL6egovwvdVBt94Yq6D~SVAxwsjPRZE8heqWPaD6sfhG3rVp8WOUQ4CE9AcQxWEKlXIGjdOotsX032~6gii-qYyfyntP2EDBh9xTuc7VAYmCIikPQ023rxBperVzSkVwlbM625KMBttXJTQ9TioYHftkfr1Io5LURkvLZYFuFUxdHhAXcBZ0Sk-d0~ZOLE9cZDnDnpLC0SB7pg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Visual_Attention_to_the_Periphery_Is_Enhanced_in_Congenitally_Deaf_Individuals","translated_slug":"","page_count":6,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84887760,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84887760/thumbnails/1.jpg","file_name":"RC93.full.pdf","download_url":"https://www.academia.edu/attachments/84887760/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Visual_Attention_to_the_Periphery_Is_Enh.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84887760/RC93.full-libre.pdf?1650938757=\u0026response-content-disposition=attachment%3B+filename%3DVisual_Attention_to_the_Periphery_Is_Enh.pdf\u0026Expires=1733053721\u0026Signature=XfEUr6WsiZXSR-AW8GxUdoA2~Rgv3MlSGQQjc7cGunecFFvAws7Zjw6iq5lu1RqHAfmy8ri0VRD-75gW2XGIdaqO5rcaw7QKr4d~KibndHy4gk0KVt-Z12SdZPxuW8msTTaF28HhzL6egovwvdVBt94Yq6D~SVAxwsjPRZE8heqWPaD6sfhG3rVp8WOUQ4CE9AcQxWEKlXIGjdOotsX032~6gii-qYyfyntP2EDBh9xTuc7VAYmCIikPQ023rxBperVzSkVwlbM625KMBttXJTQ9TioYHftkfr1Io5LURkvLZYFuFUxdHhAXcBZ0Sk-d0~ZOLE9cZDnDnpLC0SB7pg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":161,"name":"Neuroscience","url":"https://www.academia.edu/Documents/in/Neuroscience"},{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":4008,"name":"Visual attention","url":"https://www.academia.edu/Documents/in/Visual_attention"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":123274,"name":"Parietal 
Cortex","url":"https://www.academia.edu/Documents/in/Parietal_Cortex"},{"id":123277,"name":"Posterior Parietal Cortex","url":"https://www.academia.edu/Documents/in/Posterior_Parietal_Cortex"},{"id":124951,"name":"Effective Connectivity","url":"https://www.academia.edu/Documents/in/Effective_Connectivity"},{"id":214510,"name":"Structural Equation Model","url":"https://www.academia.edu/Documents/in/Structural_Equation_Model"},{"id":298502,"name":"Area Mt","url":"https://www.academia.edu/Documents/in/Area_Mt"},{"id":914074,"name":"Visual Field","url":"https://www.academia.edu/Documents/in/Visual_Field"},{"id":2922956,"name":"Psychology and Cognitive Sciences","url":"https://www.academia.edu/Documents/in/Psychology_and_Cognitive_Sciences"},{"id":3095916,"name":"normal hearing","url":"https://www.academia.edu/Documents/in/normal_hearing"},{"id":3763225,"name":"Medical and Health Sciences","url":"https://www.academia.edu/Documents/in/Medical_and_Health_Sciences"}],"urls":[{"id":19887684,"url":"https://syndication.highwire.org/content/doi/10.1523/JNEUROSCI.20-17-j0001.2000"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554079"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554079/Real_time_lexical_comprehension_in_young_children_learning_American_Sign_Language"><img alt="Research paper thumbnail of Real-time lexical comprehension in young children learning American Sign Language" class="work-thumbnail" src="https://attachments.academia-assets.com/84887708/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554079/Real_time_lexical_comprehension_in_young_children_learning_American_Sign_Language">Real-time lexical comprehension in young children learning American Sign Language</a></div><div class="wp-workCard_item"><span>Developmental science</span><span>, Jan 16, 2018</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">When children interpret spoken language in real time, linguistic information drives rapid shifts ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children&amp;#39;s developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. 

Real-time lexical comprehension in young children learning American Sign Language
Developmental Science, January 16, 2018.
https://www.academia.edu/77554079/Real_time_lexical_comprehension_in_young_children_learning_American_Sign_Language
Abstract: When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children's developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. Deaf and hearing ASL-learners showed similar gaze patterns, suggesting that the in-the-moment dynamics of eye movements during ASL processing are shaped by the constraints of processing a visual language in real time and not by dif...
Research interests: Psychology, Cognitive Science, Linguistics, Medicine, Developmental Science

Brain and Language
Neuron, 1998.
https://www.academia.edu/77554078/Brain_and_Language
http://sciencedirect.com/science/article/pii/s089662730080536x
Research interests: Psychology, Cognitive Science, Computer Science, Sign Language, Language, Medicine, Brain, Neuron, Neurosciences, Functional Laterality

The Processing of Biologically Plausible and Implausible Forms in American Sign Language: Evidence for Perceptual Tuning
Language, Cognition and Neuroscience.
https://www.academia.edu/77554077/The_Processing_of_Biologically_Plausible_and_Implausible_forms_in_American_Sign_Language_Evidence_for_Perceptual_Tuning
Abstract: The human auditory system distinguishes speech-like information from general auditory signals in a remarkably fast and efficient way. Combining psychophysics and neurophysiology (MEG), we demonstrate a similar result for the processing of visual information used for language communication in users of sign languages. We demonstrate that the earliest visual cortical responses in deaf signers viewing American Sign Language (ASL) signs show specific modulations to violations of anatomic constraints that would make the sign either possible or impossible to articulate. These neural data are accompanied with a significantly increased perceptual sensitivity to the anatomical incongruity. The differential effects in the early visual evoked potentials arguably reflect an expectation-driven assessment of somatic representational integrity, suggesting that language experience and/or auditory deprivation may shape the neuronal mechanisms underlying the analysis of complex human form. The data de...
Research interests: Psychology, Medicine, Routledge
class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554076"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554076; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554076]").text(description); $(".js-view-count[data-work-id=77554076]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554076; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554076']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554076, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "1f2fd271ca1a1646b8b29ff56b41c227" } } $('.js-work-strip[data-work-id=77554076]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554076,"title":"Human Temporal Cortical Single Neuron Activity During Working Memory Maintenance","translated_title":"","metadata":{"publisher":"Elsevier 
BV","publication_date":{"day":null,"month":null,"year":2016,"errors":{}},"publication_name":"Neuropsychologia"},"translated_abstract":null,"internal_url":"https://www.academia.edu/77554076/Human_Temporal_Cortical_Single_Neuron_Activity_During_Working_Memory_Maintenance","translated_internal_url":"","created_at":"2022-04-25T04:29:42.375-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[{"id":84927744,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84927744/thumbnails/1.jpg","file_name":"ptpmcrender.pdf","download_url":"https://www.academia.edu/attachments/84927744/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_Temporal_Cortical_Single_Neuron_Ac.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84927744/ptpmcrender-libre.pdf?1650938979=\u0026response-content-disposition=attachment%3B+filename%3DHuman_Temporal_Cortical_Single_Neuron_Ac.pdf\u0026Expires=1733053721\u0026Signature=aib-DEu0v6y7H1--mPBVX0phzbqp-40jXmyBT3ZvoFiim0N0y9hGlMLYYyHRuf1iFq~23XwFSsrdw537CKa7D0ltjJoaEugN2TuS-P9fEx2hjsffkag3YJiu9XJphuOeZg3OonwghGtnCvGrzEZG41rC7keUOVCTAEr~whR8dwiilPLbj51lcaKOXEJ3aO7q5WlRC5rqayAnxbEVUXhhiX0E8oh7KfLez7JTZ74sVx-ES1nmiU7FTy9qDix1zLmhaU40s3GWv5UKEeE98~uBgOUd53pX4-S8JRcuOoICuzbWUsUacskzN1aWZliiCzILGzwfVPWyw1lsJu476KZUyA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"slug":"Human_Temporal_Cortical_Single_Neuron_Activity_During_Working_Memory_Maintenance","translated_slug":"","page_count":29,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[{"id":84927744,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/84927744/thumbnails/1.jpg","file_name":"ptpmcrender.pdf","download_url":"https://www.academia.edu/attachments/84927744/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&","bulk_download_file_name":"Human_Temporal_Cortical_Single_Neuron_Ac.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/84927744/ptpmcrender-libre.pdf?1650938979=\u0026response-content-disposition=attachment%3B+filename%3DHuman_Temporal_Cortical_Single_Neuron_Ac.pdf\u0026Expires=1733053721\u0026Signature=aib-DEu0v6y7H1--mPBVX0phzbqp-40jXmyBT3ZvoFiim0N0y9hGlMLYYyHRuf1iFq~23XwFSsrdw537CKa7D0ltjJoaEugN2TuS-P9fEx2hjsffkag3YJiu9XJphuOeZg3OonwghGtnCvGrzEZG41rC7keUOVCTAEr~whR8dwiilPLbj51lcaKOXEJ3aO7q5WlRC5rqayAnxbEVUXhhiX0E8oh7KfLez7JTZ74sVx-ES1nmiU7FTy9qDix1zLmhaU40s3GWv5UKEeE98~uBgOUd53pX4-S8JRcuOoICuzbWUsUacskzN1aWZliiCzILGzwfVPWyw1lsJu476KZUyA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}],"research_interests":[{"id":221,"name":"Psychology","url":"https://www.academia.edu/Documents/in/Psychology"},{"id":237,"name":"Cognitive Science","url":"https://www.academia.edu/Documents/in/Cognitive_Science"},{"id":26327,"name":"Medicine","url":"https://www.academia.edu/Documents/in/Medicine"},{"id":452621,"name":"Neuropsychologia","url":"https://www.academia.edu/Documents/in/Neuropsychologia"},{"id":1239755,"name":"Neurosciences","url":"https://www.academia.edu/Documents/in/Neurosciences"}],"urls":[]}, 
dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554074"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554074/nfluences_of_Linguistic_and_Non_Linguistic_Factors_in_the_Processing_of_American_Sign_Language_Evidence_from_Handshape_Monitoring"><img alt="Research paper thumbnail of nfluences of Linguistic and Non-Linguistic Factors in the Processing of American Sign Language: Evidence from Handshape Monitoring" class="work-thumbnail" src="https://attachments.academia-assets.com/84887720/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554074/nfluences_of_Linguistic_and_Non_Linguistic_Factors_in_the_Processing_of_American_Sign_Language_Evidence_from_Handshape_Monitoring">nfluences of Linguistic and Non-Linguistic Factors in the Processing of American Sign Language: Evidence from Handshape Monitoring</a></div><div class="wp-workCard_item"><span>Annual Meeting of the Berkeley Linguistics Society</span><span>, 2009</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="93a75e4b43fee9205880dab5876bf6c2" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84887720,&quot;asset_id&quot;:77554074,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84887720/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554074"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554074"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554074; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554074]").text(description); $(".js-view-count[data-work-id=77554074]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554074; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554074']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); 
An fMRI study of perception and action in deaf signers
Neuropsychologia, 2016.

Imaging","url":"https://www.academia.edu/Documents/in/Magnetic_Resonance_Imaging"},{"id":22506,"name":"Adolescent","url":"https://www.academia.edu/Documents/in/Adolescent"},{"id":52176,"name":"Brain Mapping","url":"https://www.academia.edu/Documents/in/Brain_Mapping"},{"id":61474,"name":"Brain","url":"https://www.academia.edu/Documents/in/Brain"},{"id":153836,"name":"Motor Cortex","url":"https://www.academia.edu/Documents/in/Motor_Cortex"},{"id":226636,"name":"Deafness","url":"https://www.academia.edu/Documents/in/Deafness"},{"id":406036,"name":"Parietal Lobe","url":"https://www.academia.edu/Documents/in/Parietal_Lobe"},{"id":452621,"name":"Neuropsychologia","url":"https://www.academia.edu/Documents/in/Neuropsychologia"},{"id":1239755,"name":"Neurosciences","url":"https://www.academia.edu/Documents/in/Neurosciences"},{"id":1959585,"name":"Broca area","url":"https://www.academia.edu/Documents/in/Broca_area"},{"id":2444775,"name":"Psychomotor Performance","url":"https://www.academia.edu/Documents/in/Psychomotor_Performance"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554072"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554072/Brain_and_Language_Minireview_a_Perspective_from_Sign_Language"><img alt="Research paper thumbnail of Brain and Language: Minireview a Perspective from Sign Language" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554072/Brain_and_Language_Minireview_a_Perspective_from_Sign_Language">Brain and Language: Minireview a Perspective from Sign Language</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Psychology Department representations found in spoken language, includingUniversity of Oregon pho...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Psychology Department representations found in spoken language, includingUniversity of Oregon phonology, morphology, syntax, semantics, and prag-Eugene, Oregon 97403 matics (Lillo-Martin, 1991; Corina and Sandler, 1993).Thus, similar linguistic structures are found in spokenand signed languages. A number of authors have pro-One of the most enduring and significant findings from posed that the left hemisphere recruitmentfor languageneuropsychology is the left hemisphere dominance for results from aspecialization oftheseareas forthe analy-language processing. Studies both past and present sis of linguistic structures. 
By this view, the structuralconverge to establish a widespread language network similarity between signed and spoken languages pre-in the left peri-sylvian cortex which encompasses at dicts that left hemisphere language areas should alsoleast four main regions: Broca’s area, within the inferior be recruited during ASL processing.prefrontal cortex; Wernicke’s area, within the posterior On the surface, however, ASL differs markedly fromtwo-thirds of the superior temporal lobe; the anterior spoken languages. For example, in ASL, phonologicalportion of the superior temporal lobe; and the middle distinctions are created by the positions and shape ofprefrontal cortex (Neville and Bavelier, 1998). While the the hands relative to the body rather than by acousticlanguage processing abilities of the left hemisphere are features such as nasality and voicing found in spokenuncontroversial, little is known about the determinants languages. The fact that signed and spoken languagesof this left hemisphere specialization for language. Are rely on different input and output modalities carries im-these areas geneticallydetermined to processlinguistic portant consequences for theories on the origin of theinformation? To what extent is this organization influ- left hemisphere dominance for language. It is often ar-enced by the language experience of each individual? gued that the left hemisphere specialization for lan-guage originates from a left hemisphere advantage toWhat role doesthe acoustic structure of languagesplayexecute fine temporal discrimination, such as the fastin this pattern of organization?acoustic processingrequired during speech perceptionTodate,mostofourunderstandingoftheneuralbases(Tallal et al., 1993). By this view, the standard left hemi-of language is derived from the studies of spoken lan-sphere language areas may not be recruited during theguages. Unfortunately, this spoken language bias limitsprocessing of visuo-spatial languages such as ASL.our ability to infer the determinants of left hemisphere Signed and spoken languages also differ by the wayspecialization for human language. For example, weare they convey linguistic information. While most aspectsunable to assess whether left hemisphere dominance of spoken languages rely on fast acoustic transitionsarises from the analysis of the sequential/hierarchical (e.g., consonant contrast) and temporalorderingof con-structures that are the building blocks of natural lan- stituents (e.g., suffixation, prefixation, word order, etc.),guages or rather is attributable to processing of the sign languages make significant use of visuo-spatialacoustic signal of spoken language. devices. For example, the use of signing space as aAmerican Sign Language (ASL), which makes use of staging ground for the depiction of grammatical rela-spatial location and motion of the hands in encoding tions is a prominent feature of ASL syntax. As shown inlinguistic information, enables us to investigate this is- Figure 1, in ASL,nominals introducedinto the discoursesue. The comparison of the neural representations of are assigned arbitrary reference points in a horizontalspoken and signed languages permits the separationof plane of signing space. 
Signs with pronominal functionthose brain structures that are common to all natural are directedtoward thesepoints, and verb signs obliga-human languages from those that are determined by torilymovebetweensuchpointsinspecifyinggrammati-the modality in which a language develops, providing cal relations (subject of, object of). Thus, grammaticalnew insight into the specificity of left hemisphere spe- functions served in many spoken languages by casecialization for language. marking or by linear ordering of words are fulfilled inIn this paper, we will first review some properties of ASL by spatial mechanisms; this is often referred to asASL and then discuss the contribution of the left hemi- “spatialized syntax” (Lillo-Martin, 1991; Poizner et al.,sphere and that of the right hemisphere to ASL pro- 1987; but see Liddell, 1998, for an alternative view).cessing. Another example of ASL processing that makes specialuse of visuo-spatial information is the classifier system.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554072"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554072"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554072; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554072]").text(description); $(".js-view-count[data-work-id=77554072]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554072; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554072']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554072, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=77554072]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554072,"title":"Brain and Language: Minireview a Perspective from Sign 
Language","translated_title":"","metadata":{"abstract":"Psychology Department representations found in spoken language, includingUniversity of Oregon phonology, morphology, syntax, semantics, and prag-Eugene, Oregon 97403 matics (Lillo-Martin, 1991; Corina and Sandler, 1993).Thus, similar linguistic structures are found in spokenand signed languages. A number of authors have pro-One of the most enduring and significant findings from posed that the left hemisphere recruitmentfor languageneuropsychology is the left hemisphere dominance for results from aspecialization oftheseareas forthe analy-language processing. Studies both past and present sis of linguistic structures. By this view, the structuralconverge to establish a widespread language network similarity between signed and spoken languages pre-in the left peri-sylvian cortex which encompasses at dicts that left hemisphere language areas should alsoleast four main regions: Broca’s area, within the inferior be recruited during ASL processing.prefrontal cortex; Wernicke’s area, within the posterior On the surface, however, ASL differs markedly fromtwo-thirds of the superior temporal lobe; the anterior spoken languages. For example, in ASL, phonologicalportion of the superior temporal lobe; and the middle distinctions are created by the positions and shape ofprefrontal cortex (Neville and Bavelier, 1998). While the the hands relative to the body rather than by acousticlanguage processing abilities of the left hemisphere are features such as nasality and voicing found in spokenuncontroversial, little is known about the determinants languages. The fact that signed and spoken languagesof this left hemisphere specialization for language. Are rely on different input and output modalities carries im-these areas geneticallydetermined to processlinguistic portant consequences for theories on the origin of theinformation? To what extent is this organization influ- left hemisphere dominance for language. It is often ar-enced by the language experience of each individual? gued that the left hemisphere specialization for lan-guage originates from a left hemisphere advantage toWhat role doesthe acoustic structure of languagesplayexecute fine temporal discrimination, such as the fastin this pattern of organization?acoustic processingrequired during speech perceptionTodate,mostofourunderstandingoftheneuralbases(Tallal et al., 1993). By this view, the standard left hemi-of language is derived from the studies of spoken lan-sphere language areas may not be recruited during theguages. Unfortunately, this spoken language bias limitsprocessing of visuo-spatial languages such as ASL.our ability to infer the determinants of left hemisphere Signed and spoken languages also differ by the wayspecialization for human language. For example, weare they convey linguistic information. While most aspectsunable to assess whether left hemisphere dominance of spoken languages rely on fast acoustic transitionsarises from the analysis of the sequential/hierarchical (e.g., consonant contrast) and temporalorderingof con-structures that are the building blocks of natural lan- stituents (e.g., suffixation, prefixation, word order, etc.),guages or rather is attributable to processing of the sign languages make significant use of visuo-spatialacoustic signal of spoken language. devices. 
For example, the use of signing space as aAmerican Sign Language (ASL), which makes use of staging ground for the depiction of grammatical rela-spatial location and motion of the hands in encoding tions is a prominent feature of ASL syntax. As shown inlinguistic information, enables us to investigate this is- Figure 1, in ASL,nominals introducedinto the discoursesue. The comparison of the neural representations of are assigned arbitrary reference points in a horizontalspoken and signed languages permits the separationof plane of signing space. Signs with pronominal functionthose brain structures that are common to all natural are directedtoward thesepoints, and verb signs obliga-human languages from those that are determined by torilymovebetweensuchpointsinspecifyinggrammati-the modality in which a language develops, providing cal relations (subject of, object of). Thus, grammaticalnew insight into the specificity of left hemisphere spe- functions served in many spoken languages by casecialization for language. marking or by linear ordering of words are fulfilled inIn this paper, we will first review some properties of ASL by spatial mechanisms; this is often referred to asASL and then discuss the contribution of the left hemi- “spatialized syntax” (Lillo-Martin, 1991; Poizner et al.,sphere and that of the right hemisphere to ASL pro- 1987; but see Liddell, 1998, for an alternative view).cessing. Another example of ASL processing that makes specialuse of visuo-spatial information is the classifier system."},"translated_abstract":"Psychology Department representations found in spoken language, includingUniversity of Oregon phonology, morphology, syntax, semantics, and prag-Eugene, Oregon 97403 matics (Lillo-Martin, 1991; Corina and Sandler, 1993).Thus, similar linguistic structures are found in spokenand signed languages. A number of authors have pro-One of the most enduring and significant findings from posed that the left hemisphere recruitmentfor languageneuropsychology is the left hemisphere dominance for results from aspecialization oftheseareas forthe analy-language processing. Studies both past and present sis of linguistic structures. By this view, the structuralconverge to establish a widespread language network similarity between signed and spoken languages pre-in the left peri-sylvian cortex which encompasses at dicts that left hemisphere language areas should alsoleast four main regions: Broca’s area, within the inferior be recruited during ASL processing.prefrontal cortex; Wernicke’s area, within the posterior On the surface, however, ASL differs markedly fromtwo-thirds of the superior temporal lobe; the anterior spoken languages. For example, in ASL, phonologicalportion of the superior temporal lobe; and the middle distinctions are created by the positions and shape ofprefrontal cortex (Neville and Bavelier, 1998). While the the hands relative to the body rather than by acousticlanguage processing abilities of the left hemisphere are features such as nasality and voicing found in spokenuncontroversial, little is known about the determinants languages. The fact that signed and spoken languagesof this left hemisphere specialization for language. Are rely on different input and output modalities carries im-these areas geneticallydetermined to processlinguistic portant consequences for theories on the origin of theinformation? To what extent is this organization influ- left hemisphere dominance for language. It is often ar-enced by the language experience of each individual? 
gued that the left hemisphere specialization for lan-guage originates from a left hemisphere advantage toWhat role doesthe acoustic structure of languagesplayexecute fine temporal discrimination, such as the fastin this pattern of organization?acoustic processingrequired during speech perceptionTodate,mostofourunderstandingoftheneuralbases(Tallal et al., 1993). By this view, the standard left hemi-of language is derived from the studies of spoken lan-sphere language areas may not be recruited during theguages. Unfortunately, this spoken language bias limitsprocessing of visuo-spatial languages such as ASL.our ability to infer the determinants of left hemisphere Signed and spoken languages also differ by the wayspecialization for human language. For example, weare they convey linguistic information. While most aspectsunable to assess whether left hemisphere dominance of spoken languages rely on fast acoustic transitionsarises from the analysis of the sequential/hierarchical (e.g., consonant contrast) and temporalorderingof con-structures that are the building blocks of natural lan- stituents (e.g., suffixation, prefixation, word order, etc.),guages or rather is attributable to processing of the sign languages make significant use of visuo-spatialacoustic signal of spoken language. devices. For example, the use of signing space as aAmerican Sign Language (ASL), which makes use of staging ground for the depiction of grammatical rela-spatial location and motion of the hands in encoding tions is a prominent feature of ASL syntax. As shown inlinguistic information, enables us to investigate this is- Figure 1, in ASL,nominals introducedinto the discoursesue. The comparison of the neural representations of are assigned arbitrary reference points in a horizontalspoken and signed languages permits the separationof plane of signing space. Signs with pronominal functionthose brain structures that are common to all natural are directedtoward thesepoints, and verb signs obliga-human languages from those that are determined by torilymovebetweensuchpointsinspecifyinggrammati-the modality in which a language develops, providing cal relations (subject of, object of). Thus, grammaticalnew insight into the specificity of left hemisphere spe- functions served in many spoken languages by casecialization for language. marking or by linear ordering of words are fulfilled inIn this paper, we will first review some properties of ASL by spatial mechanisms; this is often referred to asASL and then discuss the contribution of the left hemi- “spatialized syntax” (Lillo-Martin, 1991; Poizner et al.,sphere and that of the right hemisphere to ASL pro- 1987; but see Liddell, 1998, for an alternative view).cessing. 
Another example of ASL processing that makes specialuse of visuo-spatial information is the classifier system.","internal_url":"https://www.academia.edu/77554072/Brain_and_Language_Minireview_a_Perspective_from_Sign_Language","translated_internal_url":"","created_at":"2022-04-25T04:29:41.932-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"Brain_and_Language_Minireview_a_Perspective_from_Sign_Language","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[],"research_interests":[{"id":422,"name":"Computer Science","url":"https://www.academia.edu/Documents/in/Computer_Science"},{"id":1214,"name":"Sign Language","url":"https://www.academia.edu/Documents/in/Sign_Language"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554071"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554071/Unobtrusive_integration_of_data_management_with_fMRI_analysis"><img alt="Research paper thumbnail of Unobtrusive integration of data management with fMRI analysis" class="work-thumbnail" src="https://attachments.academia-assets.com/84887727/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554071/Unobtrusive_integration_of_data_management_with_fMRI_analysis">Unobtrusive integration of data management with fMRI analysis</a></div><div class="wp-workCard_item"><span>Neuroinformatics</span><span>, 2007</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="a4592acb11e095e84892032da7ca78c4" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:84887727,&quot;asset_id&quot;:77554071,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/84887727/download_file?st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&st=MTczMzA1MDEyMSw4LjIyMi4yMDguMTQ2&s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554071"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554071"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 
A Connectionist Perspective on Prosodic Structure
Annual Meeting of the Berkeley Linguistics Society, 1989.
Proceedings of the Fifteenth Annual Meeting of the Berkeley Linguistics Society (1989), pp. 114-125.

href="https://www.academia.edu/77554069/A_new_neuron_spike_sorting_method_using_maximal_overlap_discrete_wavelet_transform_and_rotated_principal_component_analysis"><img alt="Research paper thumbnail of A new neuron spike sorting method using maximal overlap discrete wavelet transform and rotated principal component analysis" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554069/A_new_neuron_spike_sorting_method_using_maximal_overlap_discrete_wavelet_transform_and_rotated_principal_component_analysis">A new neuron spike sorting method using maximal overlap discrete wavelet transform and rotated principal component analysis</a></div><div class="wp-workCard_item"><span>Proceedings of the 25th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEEE Cat. No.03CH37439)</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Abstract A new method for neuron spike sorting is presented that uses maximal overlap discrete wa...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Abstract A new method for neuron spike sorting is presented that uses maximal overlap discrete wavelet transform (MODWT) and rotated principal component analysis. MODWT is very effective in extracting the bandwidth of neuron firings without shape distortion. We then used a rotated principal component analysis to isolate unique neuron templates. In this procedure the first principal component serves as a neuron spike template. This component is then removed from the original data and the procedure is repeated. 
Thus, the recursive ...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="77554069"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span><span id="work-strip-rankings-button-container"></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="77554069"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 77554069; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=77554069]").text(description); $(".js-view-count[data-work-id=77554069]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 77554069; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='77554069']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span><span><script>$(function() { new Works.PaperRankView({ workId: 77554069, container: "", }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-f77ea15d77ce96025a6048a514272ad8becbad23c641fc2b3bd6e24ca6ff1932.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=77554069]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":77554069,"title":"A new neuron spike sorting method using maximal overlap discrete wavelet transform and rotated principal component analysis","translated_title":"","metadata":{"abstract":"Abstract A new method for neuron spike sorting is presented that uses maximal overlap discrete wavelet transform (MODWT) and rotated principal component analysis. MODWT is very effective in extracting the bandwidth of neuron firings without shape distortion. We then used a rotated principal component analysis to isolate unique neuron templates. In this procedure the first principal component serves as a neuron spike template. This component is then removed from the original data and the procedure is repeated. Thus, the recursive ...","publisher":"IEEE","publication_name":"Proceedings of the 25th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEEE Cat. 
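The recursive template-extraction idea in this abstract can be illustrated with a short sketch. The code below is not the paper's implementation: it assumes spike waveforms have already been detected and aligned, stands in for the MODWT band-extraction step with a simple Butterworth band-pass filter, and reduces "rotated PCA" to ordinary PCA via SVD purely for illustration. The names `spike_band` and `extract_templates` and all parameter values are hypothetical.

```python
# Minimal, hypothetical sketch of recursive spike-template extraction.
# Not the authors' code; see the assumptions stated above.
import numpy as np
from scipy.signal import butter, filtfilt

def spike_band(raw, fs, lo=300.0, hi=3000.0, order=4):
    """Stand-in for the MODWT step: keep only the band where spikes live."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, raw)

def extract_templates(spikes, n_templates=3):
    """Peel off one principal-component template at a time, recursively.

    spikes: (n_spikes, n_samples) matrix of aligned waveforms.
    Returns an (n_templates, n_samples) array of candidate templates.
    """
    residual = spikes - spikes.mean(axis=0)
    templates = []
    for _ in range(n_templates):
        # First principal component of the remaining waveforms = next template.
        _, _, vt = np.linalg.svd(residual, full_matrices=False)
        template = vt[0]
        templates.append(template)
        # Remove that component's contribution before the next pass.
        scores = residual @ template
        residual = residual - np.outer(scores, template)
    return np.array(templates)
```

In a full pipeline, each extracted template would then be matched back against the band-limited recording to assign individual spikes to units; that matching step is omitted here.
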
No.03CH37439)"},"translated_abstract":"Abstract A new method for neuron spike sorting is presented that uses maximal overlap discrete wavelet transform (MODWT) and rotated principal component analysis. MODWT is very effective in extracting the bandwidth of neuron firings without shape distortion. We then used a rotated principal component analysis to isolate unique neuron templates. In this procedure the first principal component serves as a neuron spike template. This component is then removed from the original data and the procedure is repeated. Thus, the recursive ...","internal_url":"https://www.academia.edu/77554069/A_new_neuron_spike_sorting_method_using_maximal_overlap_discrete_wavelet_transform_and_rotated_principal_component_analysis","translated_internal_url":"","created_at":"2022-04-25T04:29:41.555-07:00","preview_url":null,"current_user_can_edit":null,"current_user_is_owner":null,"owner_id":142673724,"coauthors_can_edit":true,"document_type":"paper","co_author_tags":[],"downloadable_attachments":[],"slug":"A_new_neuron_spike_sorting_method_using_maximal_overlap_discrete_wavelet_transform_and_rotated_principal_component_analysis","translated_slug":"","page_count":null,"language":"en","content_type":"Work","owner":{"id":142673724,"first_name":"David","middle_initials":null,"last_name":"Corina","page_name":"DavidCorina2","domain_name":"independent","created_at":"2020-01-20T04:49:45.235-08:00","display_name":"David Corina","url":"https://independent.academia.edu/DavidCorina2"},"attachments":[],"research_interests":[{"id":5069,"name":"Principal Component Analysis","url":"https://www.academia.edu/Documents/in/Principal_Component_Analysis"},{"id":22272,"name":"Neurophysiology","url":"https://www.academia.edu/Documents/in/Neurophysiology"},{"id":91365,"name":"Wavelet Transforms","url":"https://www.academia.edu/Documents/in/Wavelet_Transforms"},{"id":160144,"name":"Feature Extraction","url":"https://www.academia.edu/Documents/in/Feature_Extraction"},{"id":557843,"name":"Discrete wavelet transform","url":"https://www.academia.edu/Documents/in/Discrete_wavelet_transform"}],"urls":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="77554068"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/77554068/Exploring_the_movement_dynamics_of_manual_and_oral_articulation_Evidence_from_coarticulation"><img alt="Research paper thumbnail of Exploring the movement dynamics of manual and oral articulation: Evidence from coarticulation" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/77554068/Exploring_the_movement_dynamics_of_manual_and_oral_articulation_Evidence_from_coarticulation">Exploring the movement dynamics of manual and oral articulation: Evidence from coarticulation</a></div><div class="wp-workCard_item"><span>Laboratory Phonology</span><span>, 2012</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">This project explores three classes of human action through an investigation of long-distance 
Research paper: Exploring the movement dynamics of manual and oral articulation: Evidence from coarticulation
Publication: Laboratory Phonology, 2012 (Walter de Gruyter GmbH)

Abstract: This project explores three classes of human action through an investigation of long-distance coarticulation, defined here as the articulatory influence of one phonetic element (e.g., consonant or vowel) on another across more than one intervening element. Our first experiment investigated anticipatory vowel-to-vowel (VV) coarticulation in English. The second experiment was patterned after the first but dealt instead with anticipatory location-to-location (LL) effects in American Sign Language (ASL). The sign experiment also incorporated a non-linguistic manual action, permitting a comparison of effects not only between spoken and signed language, but also between linguistic and non-linguistic manual actions. For the spoken-language study, sentences were created in which multiple consecutive schwas (target vowels) were followed by various context vowels. Eighteen English speakers were recorded as they repeated each sentence six times, and statistical tests were performed to determine...

Research interests: Psychology, Sign Language, Coarticulation, ASL, Laboratory Phonology, Movement Dynamics, Long Distance
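The abstract stops at "statistical tests were performed to determine...", so the test itself is not specified. As a purely illustrative aside, the toy Python snippet below shows one conventional way an anticipatory V-to-V effect can be checked: comparing a target schwa's F2 within speakers as a function of the upcoming context vowel. The speaker count matches the abstract, but the formant values and the choice of a paired t-test are assumptions made for the example, not the paper's analysis.

# Toy within-speaker comparison of schwa F2 before two context vowels; all
# values are simulated and the test choice is an assumption, not the paper's.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_speakers = 18                                   # matches the abstract's speaker count

baseline = rng.normal(1500, 80, n_speakers)       # per-speaker mean schwa F2 (Hz)
f2_before_i = baseline + rng.normal(40, 25, n_speakers)    # F2 pulled up before /i/
f2_before_u = baseline + rng.normal(-30, 25, n_speakers)   # F2 pulled down before /u/

t, p = stats.ttest_rel(f2_before_i, f2_before_u)  # paired (within-speaker) comparison
print(f"paired t({n_speakers - 1}) = {t:.2f}, p = {p:.4f}")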
