Catherine Guastavino | McGill University - Academia.edu

<!DOCTYPE html> <html lang="en" xmlns:fb="http://www.facebook.com/2008/fbml" class="wf-loading"> <head prefix="og: https://ogp.me/ns# fb: https://ogp.me/ns/fb# academia: https://ogp.me/ns/fb/academia#"> <meta charset="utf-8"> <meta name=viewport content="width=device-width, initial-scale=1"> <meta rel="search" type="application/opensearchdescription+xml" href="/open_search.xml" title="Academia.edu"> <title>Catherine Guastavino | McGill University - Academia.edu</title> <!-- _ _ _ | | (_) | | __ _ ___ __ _ __| | ___ _ __ ___ _ __ _ ___ __| |_ _ / _` |/ __/ _` |/ _` |/ _ \ '_ ` _ \| |/ _` | / _ \/ _` | | | | | (_| | (_| (_| | (_| | __/ | | | | | | (_| || __/ (_| | |_| | \__,_|\___\__,_|\__,_|\___|_| |_| |_|_|\__,_(_)___|\__,_|\__,_| We're hiring! See https://www.academia.edu/hiring --> <link href="//a.academia-assets.com/images/favicons/favicon-production.ico" rel="shortcut icon" type="image/vnd.microsoft.icon"> <link rel="apple-touch-icon" sizes="57x57" href="//a.academia-assets.com/images/favicons/apple-touch-icon-57x57.png"> <link rel="apple-touch-icon" sizes="60x60" href="//a.academia-assets.com/images/favicons/apple-touch-icon-60x60.png"> <link rel="apple-touch-icon" sizes="72x72" href="//a.academia-assets.com/images/favicons/apple-touch-icon-72x72.png"> <link rel="apple-touch-icon" sizes="76x76" href="//a.academia-assets.com/images/favicons/apple-touch-icon-76x76.png"> <link rel="apple-touch-icon" sizes="114x114" href="//a.academia-assets.com/images/favicons/apple-touch-icon-114x114.png"> <link rel="apple-touch-icon" sizes="120x120" href="//a.academia-assets.com/images/favicons/apple-touch-icon-120x120.png"> <link rel="apple-touch-icon" sizes="144x144" href="//a.academia-assets.com/images/favicons/apple-touch-icon-144x144.png"> <link rel="apple-touch-icon" sizes="152x152" href="//a.academia-assets.com/images/favicons/apple-touch-icon-152x152.png"> <link rel="apple-touch-icon" sizes="180x180" href="//a.academia-assets.com/images/favicons/apple-touch-icon-180x180.png"> <link rel="icon" type="image/png" href="//a.academia-assets.com/images/favicons/favicon-32x32.png" sizes="32x32"> <link rel="icon" type="image/png" href="//a.academia-assets.com/images/favicons/favicon-194x194.png" sizes="194x194"> <link rel="icon" type="image/png" href="//a.academia-assets.com/images/favicons/favicon-96x96.png" sizes="96x96"> <link rel="icon" type="image/png" href="//a.academia-assets.com/images/favicons/android-chrome-192x192.png" sizes="192x192"> <link rel="icon" type="image/png" href="//a.academia-assets.com/images/favicons/favicon-16x16.png" sizes="16x16"> <link rel="manifest" href="//a.academia-assets.com/images/favicons/manifest.json"> <meta name="msapplication-TileColor" content="#2b5797"> <meta name="msapplication-TileImage" content="//a.academia-assets.com/images/favicons/mstile-144x144.png"> <meta name="theme-color" content="#ffffff"> <script> window.performance && window.performance.measure && window.performance.measure("Time To First Byte", "requestStart", "responseStart"); </script> <script> (function() { if (!window.URLSearchParams || !window.history || !window.history.replaceState) { return; } var searchParams = new URLSearchParams(window.location.search); var paramsToDelete = [ 'fs', 'sm', 'swp', 'iid', 'nbs', 'rcc', // related content category 'rcpos', // related content carousel position 'rcpg', // related carousel page 'rchid', // related content hit id 'f_ri', // research interest id, for SEO tracking 'f_fri', // featured research interest, for SEO tracking (param key without value) 
'f_rid', // from research interest directory for SEO tracking 'f_loswp', // from research interest pills on LOSWP sidebar for SEO tracking 'rhid', // referrring hit id ]; if (paramsToDelete.every((key) => searchParams.get(key) === null)) { return; } paramsToDelete.forEach((key) => { searchParams.delete(key); }); var cleanUrl = new URL(window.location.href); cleanUrl.search = searchParams.toString(); history.replaceState({}, document.title, cleanUrl); })(); </script> <script async src="https://www.googletagmanager.com/gtag/js?id=G-5VKX33P2DS"></script> <script> window.dataLayer = window.dataLayer || []; function gtag(){dataLayer.push(arguments);} gtag('js', new Date()); gtag('config', 'G-5VKX33P2DS', { cookie_domain: 'academia.edu', send_page_view: false, }); gtag('event', 'page_view', { 'controller': "profiles/works", 'action': "summary", 'controller_action': 'profiles/works#summary', 'logged_in': 'false', 'edge': 'unknown', // Send nil if there is no A/B test bucket, in case some records get logged // with missing data - that way we can distinguish between the two cases. // ab_test_bucket should be of the form <ab_test_name>:<bucket> 'ab_test_bucket': null, }) </script> <script type="text/javascript"> window.sendUserTiming = function(timingName) { if (!(window.performance && window.performance.measure)) return; var entries = window.performance.getEntriesByName(timingName, "measure"); if (entries.length !== 1) return; var timingValue = Math.round(entries[0].duration); gtag('event', 'timing_complete', { name: timingName, value: timingValue, event_category: 'User-centric', }); }; window.sendUserTiming("Time To First Byte"); </script> <meta name="csrf-param" content="authenticity_token" /> <meta name="csrf-token" content="yrKRXbe9wm_FXf7V5D_543m78v66AK67jfTb2WEXMcvGjzoE-z31cDkmi1DwJxSvc-CUpYAupkHkh01vwtWy9w" /> <link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/wow-3d36c19b4875b226bfed0fcba1dcea3f2fe61148383d97c0465c016b8c969290.css" /><link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/social/home-79e78ce59bef0a338eb6540ec3d93b4a7952115b56c57f1760943128f4544d42.css" /><script type="application/ld+json">{"@context":"https://schema.org","@type":"ProfilePage","mainEntity":{"@context":"https://schema.org","@type":"Person","name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino","image":"https://0.academia-photos.com/45597668/117885521/107192199/s200_catherine.guastavino.jpg","sameAs":[]},"dateCreated":"2016-03-22T07:12:44-07:00","dateModified":"2025-02-20T20:04:55-08:00","name":"Catherine Guastavino","description":"","image":"https://0.academia-photos.com/45597668/117885521/107192199/s200_catherine.guastavino.jpg","thumbnailUrl":"https://0.academia-photos.com/45597668/117885521/107192199/s65_catherine.guastavino.jpg","primaryImageOfPage":{"@type":"ImageObject","url":"https://0.academia-photos.com/45597668/117885521/107192199/s200_catherine.guastavino.jpg","width":200},"sameAs":[],"relatedLink":"https://www.academia.edu/110629292/Montreal_soundscapes_during_the_COVID_19_pandemic_A_spatial_analysis_of_noise_complaints_and_residents_surveys"}</script><link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/design_system/heading-95367dc03b794f6737f30123738a886cf53b7a65cdef98a922a98591d60063e3.css" /><link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/design_system/button-8c9ae4b5c8a2531640c354d92a1f3579c8ff103277ef74913e34c8a76d4e6c00.css" /><link rel="stylesheet" media="all" 
href="//a.academia-assets.com/assets/design_system/body-170d1319f0e354621e81ca17054bb147da2856ec0702fe440a99af314a6338c5.css" /><style type="text/css">@media(max-width: 567px){:root{--token-mode: Parity;--dropshadow: 0 2px 4px 0 #22223340;--primary-brand: #0645b1;--error-dark: #b60000;--success-dark: #05b01c;--inactive-fill: #ebebee;--hover: #0c3b8d;--pressed: #082f75;--button-primary-fill-inactive: #ebebee;--button-primary-fill: #0645b1;--button-primary-text: #ffffff;--button-primary-fill-hover: #0c3b8d;--button-primary-fill-press: #082f75;--button-primary-icon: #ffffff;--button-primary-fill-inverse: #ffffff;--button-primary-text-inverse: #082f75;--button-primary-icon-inverse: #0645b1;--button-primary-fill-inverse-hover: #cddaef;--button-primary-stroke-inverse-pressed: #0645b1;--button-secondary-stroke-inactive: #b1b1ba;--button-secondary-fill: #eef2f9;--button-secondary-text: #082f75;--button-secondary-fill-press: #cddaef;--button-secondary-fill-inactive: #ebebee;--button-secondary-stroke: #cddaef;--button-secondary-stroke-hover: #386ac1;--button-secondary-stroke-press: #0645b1;--button-secondary-text-inactive: #b1b1ba;--button-secondary-icon: #082f75;--button-secondary-fill-hover: #e6ecf7;--button-secondary-stroke-inverse: #ffffff;--button-secondary-fill-inverse: rgba(255, 255, 255, 0);--button-secondary-icon-inverse: #ffffff;--button-secondary-icon-hover: #082f75;--button-secondary-icon-press: #082f75;--button-secondary-text-inverse: #ffffff;--button-secondary-text-hover: #082f75;--button-secondary-text-press: #082f75;--button-secondary-fill-inverse-hover: #043059;--button-xs-stroke: #141413;--button-xs-stroke-hover: #0c3b8d;--button-xs-stroke-press: #082f75;--button-xs-stroke-inactive: #ebebee;--button-xs-text: #141413;--button-xs-text-hover: #0c3b8d;--button-xs-text-press: #082f75;--button-xs-text-inactive: #91919e;--button-xs-icon: #141413;--button-xs-icon-hover: #0c3b8d;--button-xs-icon-press: #082f75;--button-xs-icon-inactive: #91919e;--button-xs-fill: #ffffff;--button-xs-fill-hover: #f4f7fc;--button-xs-fill-press: #eef2f9;--buttons-button-text-inactive: #91919e;--buttons-button-focus: #0645b1;--buttons-button-icon-inactive: #91919e;--buttons-small-buttons-corner-radius: 8px;--buttons-small-buttons-l-r-padding: 12px;--buttons-small-buttons-height: 44px;--buttons-small-buttons-gap: 8px;--buttons-small-buttons-icon-only-width: 44px;--buttons-small-buttons-icon-size: 20px;--buttons-small-buttons-stroke-default: 1px;--buttons-small-buttons-stroke-thick: 2px;--buttons-large-buttons-l-r-padding: 20px;--buttons-large-buttons-height: 54px;--buttons-large-buttons-icon-only-width: 54px;--buttons-large-buttons-icon-size: 20px;--buttons-large-buttons-gap: 8px;--buttons-large-buttons-corner-radius: 8px;--buttons-large-buttons-stroke-default: 1px;--buttons-large-buttons-stroke-thick: 2px;--buttons-extra-small-buttons-l-r-padding: 8px;--buttons-extra-small-buttons-height: 32px;--buttons-extra-small-buttons-icon-size: 16px;--buttons-extra-small-buttons-gap: 4px;--buttons-extra-small-buttons-corner-radius: 8px;--buttons-stroke-default: 1px;--buttons-stroke-thick: 2px;--background-beige: #f9f7f4;--error-light: #fff2f2;--text-placeholder: #6d6d7d;--stroke-dark: #141413;--stroke-light: #dddde2;--stroke-medium: #535366;--accent-green: #ccffd4;--accent-turquoise: #ccf7ff;--accent-yellow: #f7ffcc;--accent-peach: #ffd4cc;--accent-violet: #f7ccff;--accent-purple: #f4f7fc;--text-primary: #141413;--secondary-brand: #141413;--text-hover: #0c3b8d;--text-white: #ffffff;--text-link: #0645b1;--text-press: 
#082f75;--success-light: #f0f8f1;--background-light-blue: #eef2f9;--background-white: #ffffff;--premium-dark: #877440;--premium-light: #f9f6ed;--stroke-white: #ffffff;--inactive-content: #b1b1ba;--annotate-light: #a35dff;--annotate-dark: #824acc;--grid: #eef2f9;--inactive-stroke: #ebebee;--shadow: rgba(34, 34, 51, 0.25);--text-inactive: #6d6d7d;--text-error: #b60000;--stroke-error: #b60000;--background-error: #fff2f2;--background-black: #141413;--icon-default: #141413;--icon-blue: #0645b1;--background-grey: #dddde2;--icon-grey: #b1b1ba;--text-focus: #082f75;--brand-colors-neutral-black: #141413;--brand-colors-neutral-900: #535366;--brand-colors-neutral-800: #6d6d7d;--brand-colors-neutral-700: #91919e;--brand-colors-neutral-600: #b1b1ba;--brand-colors-neutral-500: #c8c8cf;--brand-colors-neutral-400: #dddde2;--brand-colors-neutral-300: #ebebee;--brand-colors-neutral-200: #f8f8fb;--brand-colors-neutral-100: #fafafa;--brand-colors-neutral-white: #ffffff;--brand-colors-blue-900: #043059;--brand-colors-blue-800: #082f75;--brand-colors-blue-700: #0c3b8d;--brand-colors-blue-600: #0645b1;--brand-colors-blue-500: #386ac1;--brand-colors-blue-400: #cddaef;--brand-colors-blue-300: #e6ecf7;--brand-colors-blue-200: #eef2f9;--brand-colors-blue-100: #f4f7fc;--brand-colors-gold-500: #877440;--brand-colors-gold-400: #e9e3d4;--brand-colors-gold-300: #f2efe8;--brand-colors-gold-200: #f9f6ed;--brand-colors-gold-100: #f9f7f4;--brand-colors-error-900: #920000;--brand-colors-error-500: #b60000;--brand-colors-success-900: #035c0f;--brand-colors-green: #ccffd4;--brand-colors-turquoise: #ccf7ff;--brand-colors-yellow: #f7ffcc;--brand-colors-peach: #ffd4cc;--brand-colors-violet: #f7ccff;--brand-colors-error-100: #fff2f2;--brand-colors-success-500: #05b01c;--brand-colors-success-100: #f0f8f1;--text-secondary: #535366;--icon-white: #ffffff;--background-beige-darker: #f2efe8;--icon-dark-grey: #535366;--type-font-family-sans-serif: Roboto;--type-font-family-serif: Georgia;--type-font-family-mono: IBM Plex Mono;--type-weights-300: 300;--type-weights-400: 400;--type-weights-500: 500;--type-weights-700: 700;--type-sizes-12: 12px;--type-sizes-14: 14px;--type-sizes-16: 16px;--type-sizes-18: 18px;--type-sizes-20: 20px;--type-sizes-22: 22px;--type-sizes-24: 24px;--type-sizes-28: 28px;--type-sizes-30: 30px;--type-sizes-32: 32px;--type-sizes-40: 40px;--type-sizes-42: 42px;--type-sizes-48-2: 48px;--type-line-heights-16: 16px;--type-line-heights-20: 20px;--type-line-heights-23: 23px;--type-line-heights-24: 24px;--type-line-heights-25: 25px;--type-line-heights-26: 26px;--type-line-heights-29: 29px;--type-line-heights-30: 30px;--type-line-heights-32: 32px;--type-line-heights-34: 34px;--type-line-heights-35: 35px;--type-line-heights-36: 36px;--type-line-heights-38: 38px;--type-line-heights-40: 40px;--type-line-heights-46: 46px;--type-line-heights-48: 48px;--type-line-heights-52: 52px;--type-line-heights-58: 58px;--type-line-heights-68: 68px;--type-line-heights-74: 74px;--type-line-heights-82: 82px;--type-paragraph-spacings-0: 0px;--type-paragraph-spacings-4: 4px;--type-paragraph-spacings-8: 8px;--type-paragraph-spacings-16: 16px;--type-sans-serif-xl-font-weight: 400;--type-sans-serif-xl-size: 32px;--type-sans-serif-xl-line-height: 46px;--type-sans-serif-xl-paragraph-spacing: 16px;--type-sans-serif-lg-font-weight: 400;--type-sans-serif-lg-size: 30px;--type-sans-serif-lg-line-height: 36px;--type-sans-serif-lg-paragraph-spacing: 16px;--type-sans-serif-md-font-weight: 400;--type-sans-serif-md-line-height: 
30px;--type-sans-serif-md-paragraph-spacing: 16px;--type-sans-serif-md-size: 24px;--type-sans-serif-xs-font-weight: 700;--type-sans-serif-xs-line-height: 24px;--type-sans-serif-xs-paragraph-spacing: 0px;--type-sans-serif-xs-size: 18px;--type-sans-serif-sm-font-weight: 400;--type-sans-serif-sm-line-height: 32px;--type-sans-serif-sm-paragraph-spacing: 16px;--type-sans-serif-sm-size: 20px;--type-body-xl-font-weight: 400;--type-body-xl-size: 24px;--type-body-xl-line-height: 36px;--type-body-xl-paragraph-spacing: 0px;--type-body-sm-font-weight: 400;--type-body-sm-size: 14px;--type-body-sm-line-height: 20px;--type-body-sm-paragraph-spacing: 8px;--type-body-xs-font-weight: 400;--type-body-xs-size: 12px;--type-body-xs-line-height: 16px;--type-body-xs-paragraph-spacing: 0px;--type-body-md-font-weight: 400;--type-body-md-size: 16px;--type-body-md-line-height: 20px;--type-body-md-paragraph-spacing: 4px;--type-body-lg-font-weight: 400;--type-body-lg-size: 20px;--type-body-lg-line-height: 26px;--type-body-lg-paragraph-spacing: 16px;--type-body-lg-medium-font-weight: 500;--type-body-lg-medium-size: 20px;--type-body-lg-medium-line-height: 32px;--type-body-lg-medium-paragraph-spacing: 16px;--type-body-md-medium-font-weight: 500;--type-body-md-medium-size: 16px;--type-body-md-medium-line-height: 20px;--type-body-md-medium-paragraph-spacing: 4px;--type-body-sm-bold-font-weight: 700;--type-body-sm-bold-size: 14px;--type-body-sm-bold-line-height: 20px;--type-body-sm-bold-paragraph-spacing: 8px;--type-body-sm-medium-font-weight: 500;--type-body-sm-medium-size: 14px;--type-body-sm-medium-line-height: 20px;--type-body-sm-medium-paragraph-spacing: 8px;--type-serif-md-font-weight: 400;--type-serif-md-size: 32px;--type-serif-md-paragraph-spacing: 0px;--type-serif-md-line-height: 40px;--type-serif-sm-font-weight: 400;--type-serif-sm-size: 24px;--type-serif-sm-paragraph-spacing: 0px;--type-serif-sm-line-height: 26px;--type-serif-lg-font-weight: 400;--type-serif-lg-size: 48px;--type-serif-lg-paragraph-spacing: 0px;--type-serif-lg-line-height: 52px;--type-serif-xs-font-weight: 400;--type-serif-xs-size: 18px;--type-serif-xs-line-height: 24px;--type-serif-xs-paragraph-spacing: 0px;--type-serif-xl-font-weight: 400;--type-serif-xl-size: 48px;--type-serif-xl-paragraph-spacing: 0px;--type-serif-xl-line-height: 58px;--type-mono-md-font-weight: 400;--type-mono-md-size: 22px;--type-mono-md-line-height: 24px;--type-mono-md-paragraph-spacing: 0px;--type-mono-lg-font-weight: 400;--type-mono-lg-size: 40px;--type-mono-lg-line-height: 40px;--type-mono-lg-paragraph-spacing: 0px;--type-mono-sm-font-weight: 400;--type-mono-sm-size: 14px;--type-mono-sm-line-height: 24px;--type-mono-sm-paragraph-spacing: 0px;--spacing-xs-4: 4px;--spacing-xs-8: 8px;--spacing-xs-16: 16px;--spacing-sm-24: 24px;--spacing-sm-32: 32px;--spacing-md-40: 40px;--spacing-md-48: 48px;--spacing-lg-64: 64px;--spacing-lg-80: 80px;--spacing-xlg-104: 104px;--spacing-xlg-152: 152px;--spacing-xs-12: 12px;--spacing-page-section: 80px;--spacing-card-list-spacing: 48px;--spacing-text-section-spacing: 64px;--spacing-md-xs-headings: 40px;--corner-radius-radius-lg: 16px;--corner-radius-radius-sm: 4px;--corner-radius-radius-md: 8px;--corner-radius-radius-round: 104px}}@media(min-width: 568px)and (max-width: 1279px){:root{--token-mode: Parity;--dropshadow: 0 2px 4px 0 #22223340;--primary-brand: #0645b1;--error-dark: #b60000;--success-dark: #05b01c;--inactive-fill: #ebebee;--hover: #0c3b8d;--pressed: #082f75;--button-primary-fill-inactive: #ebebee;--button-primary-fill: 
#0645b1;--button-primary-text: #ffffff;--button-primary-fill-hover: #0c3b8d;--button-primary-fill-press: #082f75;--button-primary-icon: #ffffff;--button-primary-fill-inverse: #ffffff;--button-primary-text-inverse: #082f75;--button-primary-icon-inverse: #0645b1;--button-primary-fill-inverse-hover: #cddaef;--button-primary-stroke-inverse-pressed: #0645b1;--button-secondary-stroke-inactive: #b1b1ba;--button-secondary-fill: #eef2f9;--button-secondary-text: #082f75;--button-secondary-fill-press: #cddaef;--button-secondary-fill-inactive: #ebebee;--button-secondary-stroke: #cddaef;--button-secondary-stroke-hover: #386ac1;--button-secondary-stroke-press: #0645b1;--button-secondary-text-inactive: #b1b1ba;--button-secondary-icon: #082f75;--button-secondary-fill-hover: #e6ecf7;--button-secondary-stroke-inverse: #ffffff;--button-secondary-fill-inverse: rgba(255, 255, 255, 0);--button-secondary-icon-inverse: #ffffff;--button-secondary-icon-hover: #082f75;--button-secondary-icon-press: #082f75;--button-secondary-text-inverse: #ffffff;--button-secondary-text-hover: #082f75;--button-secondary-text-press: #082f75;--button-secondary-fill-inverse-hover: #043059;--button-xs-stroke: #141413;--button-xs-stroke-hover: #0c3b8d;--button-xs-stroke-press: #082f75;--button-xs-stroke-inactive: #ebebee;--button-xs-text: #141413;--button-xs-text-hover: #0c3b8d;--button-xs-text-press: #082f75;--button-xs-text-inactive: #91919e;--button-xs-icon: #141413;--button-xs-icon-hover: #0c3b8d;--button-xs-icon-press: #082f75;--button-xs-icon-inactive: #91919e;--button-xs-fill: #ffffff;--button-xs-fill-hover: #f4f7fc;--button-xs-fill-press: #eef2f9;--buttons-button-text-inactive: #91919e;--buttons-button-focus: #0645b1;--buttons-button-icon-inactive: #91919e;--buttons-small-buttons-corner-radius: 8px;--buttons-small-buttons-l-r-padding: 12px;--buttons-small-buttons-height: 44px;--buttons-small-buttons-gap: 8px;--buttons-small-buttons-icon-only-width: 44px;--buttons-small-buttons-icon-size: 20px;--buttons-small-buttons-stroke-default: 1px;--buttons-small-buttons-stroke-thick: 2px;--buttons-large-buttons-l-r-padding: 20px;--buttons-large-buttons-height: 54px;--buttons-large-buttons-icon-only-width: 54px;--buttons-large-buttons-icon-size: 20px;--buttons-large-buttons-gap: 8px;--buttons-large-buttons-corner-radius: 8px;--buttons-large-buttons-stroke-default: 1px;--buttons-large-buttons-stroke-thick: 2px;--buttons-extra-small-buttons-l-r-padding: 8px;--buttons-extra-small-buttons-height: 32px;--buttons-extra-small-buttons-icon-size: 16px;--buttons-extra-small-buttons-gap: 4px;--buttons-extra-small-buttons-corner-radius: 8px;--buttons-stroke-default: 1px;--buttons-stroke-thick: 2px;--background-beige: #f9f7f4;--error-light: #fff2f2;--text-placeholder: #6d6d7d;--stroke-dark: #141413;--stroke-light: #dddde2;--stroke-medium: #535366;--accent-green: #ccffd4;--accent-turquoise: #ccf7ff;--accent-yellow: #f7ffcc;--accent-peach: #ffd4cc;--accent-violet: #f7ccff;--accent-purple: #f4f7fc;--text-primary: #141413;--secondary-brand: #141413;--text-hover: #0c3b8d;--text-white: #ffffff;--text-link: #0645b1;--text-press: #082f75;--success-light: #f0f8f1;--background-light-blue: #eef2f9;--background-white: #ffffff;--premium-dark: #877440;--premium-light: #f9f6ed;--stroke-white: #ffffff;--inactive-content: #b1b1ba;--annotate-light: #a35dff;--annotate-dark: #824acc;--grid: #eef2f9;--inactive-stroke: #ebebee;--shadow: rgba(34, 34, 51, 0.25);--text-inactive: #6d6d7d;--text-error: #b60000;--stroke-error: #b60000;--background-error: #fff2f2;--background-black: 
#141413;--icon-default: #141413;--icon-blue: #0645b1;--background-grey: #dddde2;--icon-grey: #b1b1ba;--text-focus: #082f75;--brand-colors-neutral-black: #141413;--brand-colors-neutral-900: #535366;--brand-colors-neutral-800: #6d6d7d;--brand-colors-neutral-700: #91919e;--brand-colors-neutral-600: #b1b1ba;--brand-colors-neutral-500: #c8c8cf;--brand-colors-neutral-400: #dddde2;--brand-colors-neutral-300: #ebebee;--brand-colors-neutral-200: #f8f8fb;--brand-colors-neutral-100: #fafafa;--brand-colors-neutral-white: #ffffff;--brand-colors-blue-900: #043059;--brand-colors-blue-800: #082f75;--brand-colors-blue-700: #0c3b8d;--brand-colors-blue-600: #0645b1;--brand-colors-blue-500: #386ac1;--brand-colors-blue-400: #cddaef;--brand-colors-blue-300: #e6ecf7;--brand-colors-blue-200: #eef2f9;--brand-colors-blue-100: #f4f7fc;--brand-colors-gold-500: #877440;--brand-colors-gold-400: #e9e3d4;--brand-colors-gold-300: #f2efe8;--brand-colors-gold-200: #f9f6ed;--brand-colors-gold-100: #f9f7f4;--brand-colors-error-900: #920000;--brand-colors-error-500: #b60000;--brand-colors-success-900: #035c0f;--brand-colors-green: #ccffd4;--brand-colors-turquoise: #ccf7ff;--brand-colors-yellow: #f7ffcc;--brand-colors-peach: #ffd4cc;--brand-colors-violet: #f7ccff;--brand-colors-error-100: #fff2f2;--brand-colors-success-500: #05b01c;--brand-colors-success-100: #f0f8f1;--text-secondary: #535366;--icon-white: #ffffff;--background-beige-darker: #f2efe8;--icon-dark-grey: #535366;--type-font-family-sans-serif: Roboto;--type-font-family-serif: Georgia;--type-font-family-mono: IBM Plex Mono;--type-weights-300: 300;--type-weights-400: 400;--type-weights-500: 500;--type-weights-700: 700;--type-sizes-12: 12px;--type-sizes-14: 14px;--type-sizes-16: 16px;--type-sizes-18: 18px;--type-sizes-20: 20px;--type-sizes-22: 22px;--type-sizes-24: 24px;--type-sizes-28: 28px;--type-sizes-30: 30px;--type-sizes-32: 32px;--type-sizes-40: 40px;--type-sizes-42: 42px;--type-sizes-48-2: 48px;--type-line-heights-16: 16px;--type-line-heights-20: 20px;--type-line-heights-23: 23px;--type-line-heights-24: 24px;--type-line-heights-25: 25px;--type-line-heights-26: 26px;--type-line-heights-29: 29px;--type-line-heights-30: 30px;--type-line-heights-32: 32px;--type-line-heights-34: 34px;--type-line-heights-35: 35px;--type-line-heights-36: 36px;--type-line-heights-38: 38px;--type-line-heights-40: 40px;--type-line-heights-46: 46px;--type-line-heights-48: 48px;--type-line-heights-52: 52px;--type-line-heights-58: 58px;--type-line-heights-68: 68px;--type-line-heights-74: 74px;--type-line-heights-82: 82px;--type-paragraph-spacings-0: 0px;--type-paragraph-spacings-4: 4px;--type-paragraph-spacings-8: 8px;--type-paragraph-spacings-16: 16px;--type-sans-serif-xl-font-weight: 400;--type-sans-serif-xl-size: 42px;--type-sans-serif-xl-line-height: 46px;--type-sans-serif-xl-paragraph-spacing: 16px;--type-sans-serif-lg-font-weight: 400;--type-sans-serif-lg-size: 32px;--type-sans-serif-lg-line-height: 36px;--type-sans-serif-lg-paragraph-spacing: 16px;--type-sans-serif-md-font-weight: 400;--type-sans-serif-md-line-height: 34px;--type-sans-serif-md-paragraph-spacing: 16px;--type-sans-serif-md-size: 28px;--type-sans-serif-xs-font-weight: 700;--type-sans-serif-xs-line-height: 25px;--type-sans-serif-xs-paragraph-spacing: 0px;--type-sans-serif-xs-size: 20px;--type-sans-serif-sm-font-weight: 400;--type-sans-serif-sm-line-height: 30px;--type-sans-serif-sm-paragraph-spacing: 16px;--type-sans-serif-sm-size: 24px;--type-body-xl-font-weight: 400;--type-body-xl-size: 24px;--type-body-xl-line-height: 
36px;--type-body-xl-paragraph-spacing: 0px;--type-body-sm-font-weight: 400;--type-body-sm-size: 14px;--type-body-sm-line-height: 20px;--type-body-sm-paragraph-spacing: 8px;--type-body-xs-font-weight: 400;--type-body-xs-size: 12px;--type-body-xs-line-height: 16px;--type-body-xs-paragraph-spacing: 0px;--type-body-md-font-weight: 400;--type-body-md-size: 16px;--type-body-md-line-height: 20px;--type-body-md-paragraph-spacing: 4px;--type-body-lg-font-weight: 400;--type-body-lg-size: 20px;--type-body-lg-line-height: 26px;--type-body-lg-paragraph-spacing: 16px;--type-body-lg-medium-font-weight: 500;--type-body-lg-medium-size: 20px;--type-body-lg-medium-line-height: 32px;--type-body-lg-medium-paragraph-spacing: 16px;--type-body-md-medium-font-weight: 500;--type-body-md-medium-size: 16px;--type-body-md-medium-line-height: 20px;--type-body-md-medium-paragraph-spacing: 4px;--type-body-sm-bold-font-weight: 700;--type-body-sm-bold-size: 14px;--type-body-sm-bold-line-height: 20px;--type-body-sm-bold-paragraph-spacing: 8px;--type-body-sm-medium-font-weight: 500;--type-body-sm-medium-size: 14px;--type-body-sm-medium-line-height: 20px;--type-body-sm-medium-paragraph-spacing: 8px;--type-serif-md-font-weight: 400;--type-serif-md-size: 40px;--type-serif-md-paragraph-spacing: 0px;--type-serif-md-line-height: 48px;--type-serif-sm-font-weight: 400;--type-serif-sm-size: 28px;--type-serif-sm-paragraph-spacing: 0px;--type-serif-sm-line-height: 32px;--type-serif-lg-font-weight: 400;--type-serif-lg-size: 58px;--type-serif-lg-paragraph-spacing: 0px;--type-serif-lg-line-height: 68px;--type-serif-xs-font-weight: 400;--type-serif-xs-size: 18px;--type-serif-xs-line-height: 24px;--type-serif-xs-paragraph-spacing: 0px;--type-serif-xl-font-weight: 400;--type-serif-xl-size: 74px;--type-serif-xl-paragraph-spacing: 0px;--type-serif-xl-line-height: 82px;--type-mono-md-font-weight: 400;--type-mono-md-size: 22px;--type-mono-md-line-height: 24px;--type-mono-md-paragraph-spacing: 0px;--type-mono-lg-font-weight: 400;--type-mono-lg-size: 40px;--type-mono-lg-line-height: 40px;--type-mono-lg-paragraph-spacing: 0px;--type-mono-sm-font-weight: 400;--type-mono-sm-size: 14px;--type-mono-sm-line-height: 24px;--type-mono-sm-paragraph-spacing: 0px;--spacing-xs-4: 4px;--spacing-xs-8: 8px;--spacing-xs-16: 16px;--spacing-sm-24: 24px;--spacing-sm-32: 32px;--spacing-md-40: 40px;--spacing-md-48: 48px;--spacing-lg-64: 64px;--spacing-lg-80: 80px;--spacing-xlg-104: 104px;--spacing-xlg-152: 152px;--spacing-xs-12: 12px;--spacing-page-section: 104px;--spacing-card-list-spacing: 48px;--spacing-text-section-spacing: 80px;--spacing-md-xs-headings: 40px;--corner-radius-radius-lg: 16px;--corner-radius-radius-sm: 4px;--corner-radius-radius-md: 8px;--corner-radius-radius-round: 104px}}@media(min-width: 1280px){:root{--token-mode: Parity;--dropshadow: 0 2px 4px 0 #22223340;--primary-brand: #0645b1;--error-dark: #b60000;--success-dark: #05b01c;--inactive-fill: #ebebee;--hover: #0c3b8d;--pressed: #082f75;--button-primary-fill-inactive: #ebebee;--button-primary-fill: #0645b1;--button-primary-text: #ffffff;--button-primary-fill-hover: #0c3b8d;--button-primary-fill-press: #082f75;--button-primary-icon: #ffffff;--button-primary-fill-inverse: #ffffff;--button-primary-text-inverse: #082f75;--button-primary-icon-inverse: #0645b1;--button-primary-fill-inverse-hover: #cddaef;--button-primary-stroke-inverse-pressed: #0645b1;--button-secondary-stroke-inactive: #b1b1ba;--button-secondary-fill: #eef2f9;--button-secondary-text: #082f75;--button-secondary-fill-press: 
#cddaef;--button-secondary-fill-inactive: #ebebee;--button-secondary-stroke: #cddaef;--button-secondary-stroke-hover: #386ac1;--button-secondary-stroke-press: #0645b1;--button-secondary-text-inactive: #b1b1ba;--button-secondary-icon: #082f75;--button-secondary-fill-hover: #e6ecf7;--button-secondary-stroke-inverse: #ffffff;--button-secondary-fill-inverse: rgba(255, 255, 255, 0);--button-secondary-icon-inverse: #ffffff;--button-secondary-icon-hover: #082f75;--button-secondary-icon-press: #082f75;--button-secondary-text-inverse: #ffffff;--button-secondary-text-hover: #082f75;--button-secondary-text-press: #082f75;--button-secondary-fill-inverse-hover: #043059;--button-xs-stroke: #141413;--button-xs-stroke-hover: #0c3b8d;--button-xs-stroke-press: #082f75;--button-xs-stroke-inactive: #ebebee;--button-xs-text: #141413;--button-xs-text-hover: #0c3b8d;--button-xs-text-press: #082f75;--button-xs-text-inactive: #91919e;--button-xs-icon: #141413;--button-xs-icon-hover: #0c3b8d;--button-xs-icon-press: #082f75;--button-xs-icon-inactive: #91919e;--button-xs-fill: #ffffff;--button-xs-fill-hover: #f4f7fc;--button-xs-fill-press: #eef2f9;--buttons-button-text-inactive: #91919e;--buttons-button-focus: #0645b1;--buttons-button-icon-inactive: #91919e;--buttons-small-buttons-corner-radius: 8px;--buttons-small-buttons-l-r-padding: 12px;--buttons-small-buttons-height: 44px;--buttons-small-buttons-gap: 8px;--buttons-small-buttons-icon-only-width: 44px;--buttons-small-buttons-icon-size: 20px;--buttons-small-buttons-stroke-default: 1px;--buttons-small-buttons-stroke-thick: 2px;--buttons-large-buttons-l-r-padding: 20px;--buttons-large-buttons-height: 54px;--buttons-large-buttons-icon-only-width: 54px;--buttons-large-buttons-icon-size: 20px;--buttons-large-buttons-gap: 8px;--buttons-large-buttons-corner-radius: 8px;--buttons-large-buttons-stroke-default: 1px;--buttons-large-buttons-stroke-thick: 2px;--buttons-extra-small-buttons-l-r-padding: 8px;--buttons-extra-small-buttons-height: 32px;--buttons-extra-small-buttons-icon-size: 16px;--buttons-extra-small-buttons-gap: 4px;--buttons-extra-small-buttons-corner-radius: 8px;--buttons-stroke-default: 1px;--buttons-stroke-thick: 2px;--background-beige: #f9f7f4;--error-light: #fff2f2;--text-placeholder: #6d6d7d;--stroke-dark: #141413;--stroke-light: #dddde2;--stroke-medium: #535366;--accent-green: #ccffd4;--accent-turquoise: #ccf7ff;--accent-yellow: #f7ffcc;--accent-peach: #ffd4cc;--accent-violet: #f7ccff;--accent-purple: #f4f7fc;--text-primary: #141413;--secondary-brand: #141413;--text-hover: #0c3b8d;--text-white: #ffffff;--text-link: #0645b1;--text-press: #082f75;--success-light: #f0f8f1;--background-light-blue: #eef2f9;--background-white: #ffffff;--premium-dark: #877440;--premium-light: #f9f6ed;--stroke-white: #ffffff;--inactive-content: #b1b1ba;--annotate-light: #a35dff;--annotate-dark: #824acc;--grid: #eef2f9;--inactive-stroke: #ebebee;--shadow: rgba(34, 34, 51, 0.25);--text-inactive: #6d6d7d;--text-error: #b60000;--stroke-error: #b60000;--background-error: #fff2f2;--background-black: #141413;--icon-default: #141413;--icon-blue: #0645b1;--background-grey: #dddde2;--icon-grey: #b1b1ba;--text-focus: #082f75;--brand-colors-neutral-black: #141413;--brand-colors-neutral-900: #535366;--brand-colors-neutral-800: #6d6d7d;--brand-colors-neutral-700: #91919e;--brand-colors-neutral-600: #b1b1ba;--brand-colors-neutral-500: #c8c8cf;--brand-colors-neutral-400: #dddde2;--brand-colors-neutral-300: #ebebee;--brand-colors-neutral-200: #f8f8fb;--brand-colors-neutral-100: 
#fafafa;--brand-colors-neutral-white: #ffffff;--brand-colors-blue-900: #043059;--brand-colors-blue-800: #082f75;--brand-colors-blue-700: #0c3b8d;--brand-colors-blue-600: #0645b1;--brand-colors-blue-500: #386ac1;--brand-colors-blue-400: #cddaef;--brand-colors-blue-300: #e6ecf7;--brand-colors-blue-200: #eef2f9;--brand-colors-blue-100: #f4f7fc;--brand-colors-gold-500: #877440;--brand-colors-gold-400: #e9e3d4;--brand-colors-gold-300: #f2efe8;--brand-colors-gold-200: #f9f6ed;--brand-colors-gold-100: #f9f7f4;--brand-colors-error-900: #920000;--brand-colors-error-500: #b60000;--brand-colors-success-900: #035c0f;--brand-colors-green: #ccffd4;--brand-colors-turquoise: #ccf7ff;--brand-colors-yellow: #f7ffcc;--brand-colors-peach: #ffd4cc;--brand-colors-violet: #f7ccff;--brand-colors-error-100: #fff2f2;--brand-colors-success-500: #05b01c;--brand-colors-success-100: #f0f8f1;--text-secondary: #535366;--icon-white: #ffffff;--background-beige-darker: #f2efe8;--icon-dark-grey: #535366;--type-font-family-sans-serif: Roboto;--type-font-family-serif: Georgia;--type-font-family-mono: IBM Plex Mono;--type-weights-300: 300;--type-weights-400: 400;--type-weights-500: 500;--type-weights-700: 700;--type-sizes-12: 12px;--type-sizes-14: 14px;--type-sizes-16: 16px;--type-sizes-18: 18px;--type-sizes-20: 20px;--type-sizes-22: 22px;--type-sizes-24: 24px;--type-sizes-28: 28px;--type-sizes-30: 30px;--type-sizes-32: 32px;--type-sizes-40: 40px;--type-sizes-42: 42px;--type-sizes-48-2: 48px;--type-line-heights-16: 16px;--type-line-heights-20: 20px;--type-line-heights-23: 23px;--type-line-heights-24: 24px;--type-line-heights-25: 25px;--type-line-heights-26: 26px;--type-line-heights-29: 29px;--type-line-heights-30: 30px;--type-line-heights-32: 32px;--type-line-heights-34: 34px;--type-line-heights-35: 35px;--type-line-heights-36: 36px;--type-line-heights-38: 38px;--type-line-heights-40: 40px;--type-line-heights-46: 46px;--type-line-heights-48: 48px;--type-line-heights-52: 52px;--type-line-heights-58: 58px;--type-line-heights-68: 68px;--type-line-heights-74: 74px;--type-line-heights-82: 82px;--type-paragraph-spacings-0: 0px;--type-paragraph-spacings-4: 4px;--type-paragraph-spacings-8: 8px;--type-paragraph-spacings-16: 16px;--type-sans-serif-xl-font-weight: 400;--type-sans-serif-xl-size: 42px;--type-sans-serif-xl-line-height: 46px;--type-sans-serif-xl-paragraph-spacing: 16px;--type-sans-serif-lg-font-weight: 400;--type-sans-serif-lg-size: 32px;--type-sans-serif-lg-line-height: 38px;--type-sans-serif-lg-paragraph-spacing: 16px;--type-sans-serif-md-font-weight: 400;--type-sans-serif-md-line-height: 34px;--type-sans-serif-md-paragraph-spacing: 16px;--type-sans-serif-md-size: 28px;--type-sans-serif-xs-font-weight: 700;--type-sans-serif-xs-line-height: 25px;--type-sans-serif-xs-paragraph-spacing: 0px;--type-sans-serif-xs-size: 20px;--type-sans-serif-sm-font-weight: 400;--type-sans-serif-sm-line-height: 30px;--type-sans-serif-sm-paragraph-spacing: 16px;--type-sans-serif-sm-size: 24px;--type-body-xl-font-weight: 400;--type-body-xl-size: 24px;--type-body-xl-line-height: 36px;--type-body-xl-paragraph-spacing: 0px;--type-body-sm-font-weight: 400;--type-body-sm-size: 14px;--type-body-sm-line-height: 20px;--type-body-sm-paragraph-spacing: 8px;--type-body-xs-font-weight: 400;--type-body-xs-size: 12px;--type-body-xs-line-height: 16px;--type-body-xs-paragraph-spacing: 0px;--type-body-md-font-weight: 400;--type-body-md-size: 16px;--type-body-md-line-height: 20px;--type-body-md-paragraph-spacing: 4px;--type-body-lg-font-weight: 
400;--type-body-lg-size: 20px;--type-body-lg-line-height: 26px;--type-body-lg-paragraph-spacing: 16px;--type-body-lg-medium-font-weight: 500;--type-body-lg-medium-size: 20px;--type-body-lg-medium-line-height: 32px;--type-body-lg-medium-paragraph-spacing: 16px;--type-body-md-medium-font-weight: 500;--type-body-md-medium-size: 16px;--type-body-md-medium-line-height: 20px;--type-body-md-medium-paragraph-spacing: 4px;--type-body-sm-bold-font-weight: 700;--type-body-sm-bold-size: 14px;--type-body-sm-bold-line-height: 20px;--type-body-sm-bold-paragraph-spacing: 8px;--type-body-sm-medium-font-weight: 500;--type-body-sm-medium-size: 14px;--type-body-sm-medium-line-height: 20px;--type-body-sm-medium-paragraph-spacing: 8px;--type-serif-md-font-weight: 400;--type-serif-md-size: 40px;--type-serif-md-paragraph-spacing: 0px;--type-serif-md-line-height: 48px;--type-serif-sm-font-weight: 400;--type-serif-sm-size: 28px;--type-serif-sm-paragraph-spacing: 0px;--type-serif-sm-line-height: 32px;--type-serif-lg-font-weight: 400;--type-serif-lg-size: 58px;--type-serif-lg-paragraph-spacing: 0px;--type-serif-lg-line-height: 68px;--type-serif-xs-font-weight: 400;--type-serif-xs-size: 18px;--type-serif-xs-line-height: 24px;--type-serif-xs-paragraph-spacing: 0px;--type-serif-xl-font-weight: 400;--type-serif-xl-size: 74px;--type-serif-xl-paragraph-spacing: 0px;--type-serif-xl-line-height: 82px;--type-mono-md-font-weight: 400;--type-mono-md-size: 22px;--type-mono-md-line-height: 24px;--type-mono-md-paragraph-spacing: 0px;--type-mono-lg-font-weight: 400;--type-mono-lg-size: 40px;--type-mono-lg-line-height: 40px;--type-mono-lg-paragraph-spacing: 0px;--type-mono-sm-font-weight: 400;--type-mono-sm-size: 14px;--type-mono-sm-line-height: 24px;--type-mono-sm-paragraph-spacing: 0px;--spacing-xs-4: 4px;--spacing-xs-8: 8px;--spacing-xs-16: 16px;--spacing-sm-24: 24px;--spacing-sm-32: 32px;--spacing-md-40: 40px;--spacing-md-48: 48px;--spacing-lg-64: 64px;--spacing-lg-80: 80px;--spacing-xlg-104: 104px;--spacing-xlg-152: 152px;--spacing-xs-12: 12px;--spacing-page-section: 152px;--spacing-card-list-spacing: 48px;--spacing-text-section-spacing: 80px;--spacing-md-xs-headings: 40px;--corner-radius-radius-lg: 16px;--corner-radius-radius-sm: 4px;--corner-radius-radius-md: 8px;--corner-radius-radius-round: 104px}}</style><link crossorigin="" href="https://fonts.gstatic.com/" rel="preconnect" /><link href="https://fonts.googleapis.com/css2?family=DM+Sans:ital,opsz,wght@0,9..40,100..1000;1,9..40,100..1000&amp;family=Gupter:wght@400;500;700&amp;family=IBM+Plex+Mono:wght@300;400&amp;family=Material+Symbols+Outlined:opsz,wght,FILL,GRAD@20,400,0,0&amp;display=swap" rel="stylesheet" /><link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/design_system/common-57f9da13cef3fd4e2a8b655342c6488eded3e557e823fe67571f2ac77acd7b6f.css" /> <meta name="author" content="catherine guastavino" /> <meta name="description" content="Catherine Guastavino, McGill University: 182 Followers, 19 Following, 365 Research papers. Research interests: Music Performance, Soundscape, and Hearing." 
/> <meta name="google-site-verification" content="bKJMBZA7E43xhDOopFZkssMMkBRjvYERV-NaN4R6mrs" /> <script> var $controller_name = 'works'; var $action_name = "summary"; var $rails_env = 'production'; var $app_rev = '0b8ad487192af8d1cd4b80bd34002cf444c419e0'; var $domain = 'academia.edu'; var $app_host = "academia.edu"; var $asset_host = "academia-assets.com"; var $start_time = new Date().getTime(); var $recaptcha_key = "6LdxlRMTAAAAADnu_zyLhLg0YF9uACwz78shpjJB"; var $recaptcha_invisible_key = "6Lf3KHUUAAAAACggoMpmGJdQDtiyrjVlvGJ6BbAj"; var $disableClientRecordHit = false; </script> <script> window.Aedu = { hit_data: null }; window.Aedu.SiteStats = {"premium_universities_count":14000,"monthly_visitors":"31 million","monthly_visitor_count":31300000,"monthly_visitor_count_in_millions":31,"user_count":284425671,"paper_count":55203019,"paper_count_in_millions":55,"page_count":432000000,"page_count_in_millions":432,"pdf_count":16500000,"pdf_count_in_millions":16}; window.Aedu.serverRenderTime = new Date(1741339925000); window.Aedu.timeDifference = new Date().getTime() - 1741339925000; window.Aedu.isUsingCssV1 = false; window.Aedu.enableLocalization = true; window.Aedu.activateFullstory = false; window.Aedu.serviceAvailability = { status: {"attention_db":"on","bibliography_db":"on","contacts_db":"on","email_db":"on","indexability_db":"on","mentions_db":"on","news_db":"on","notifications_db":"on","offsite_mentions_db":"on","redshift":"on","redshift_exports_db":"on","related_works_db":"on","ring_db":"on","user_tests_db":"on"}, serviceEnabled: function(service) { return this.status[service] === "on"; }, readEnabled: function(service) { return this.serviceEnabled(service) || this.status[service] === "read_only"; }, }; window.Aedu.viewApmTrace = function() { // Check if x-apm-trace-id meta tag is set, and open the trace in APM // in a new window if it is. 
var apmTraceId = document.head.querySelector('meta[name="x-apm-trace-id"]'); if (apmTraceId) { var traceId = apmTraceId.content; // Use trace ID to construct URL, an example URL looks like: // https://app.datadoghq.com/apm/traces?query=trace_id%31298410148923562634 var apmUrl = 'https://app.datadoghq.com/apm/traces?query=trace_id%3A' + traceId; window.open(apmUrl, '_blank'); } }; </script> <!--[if lt IE 9]> <script src="//cdnjs.cloudflare.com/ajax/libs/html5shiv/3.7.2/html5shiv.min.js"></script> <![endif]--> <link href="https://fonts.googleapis.com/css?family=Roboto:100,100i,300,300i,400,400i,500,500i,700,700i,900,900i" rel="stylesheet"> <link rel="preload" href="//maxcdn.bootstrapcdn.com/font-awesome/4.3.0/css/font-awesome.min.css" as="style" onload="this.rel='stylesheet'"> <link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/libraries-a9675dcb01ec4ef6aa807ba772c7a5a00c1820d3ff661c1038a20f80d06bb4e4.css" /> <link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/academia-1eb081e01ca8bc0c1b1d866df79d9eb4dd2c484e4beecf76e79a7806c72fee08.css" /> <link rel="stylesheet" media="all" href="//a.academia-assets.com/assets/design_system_legacy-056a9113b9a0f5343d013b29ee1929d5a18be35fdcdceb616600b4db8bd20054.css" /> <script src="//a.academia-assets.com/assets/webpack_bundles/runtime-bundle-005434038af4252ca37c527588411a3d6a0eabb5f727fac83f8bbe7fd88d93bb.js"></script> <script src="//a.academia-assets.com/assets/webpack_bundles/webpack_libraries_and_infrequently_changed.wjs-bundle-01ce4b87eff03f3b56bd66ceda94cbe720cba1d8e809023211736b2903300f3f.js"></script> <script src="//a.academia-assets.com/assets/webpack_bundles/core_webpack.wjs-bundle-ac03a366e050150e192ff686014c748a548d266e91c59930ce516bdce30176f0.js"></script> <script src="//a.academia-assets.com/assets/webpack_bundles/sentry.wjs-bundle-5fe03fddca915c8ba0f7edbe64c194308e8ce5abaed7bffe1255ff37549c4808.js"></script> <script> jade = window.jade || {}; jade.helpers = window.$h; jade._ = window._; </script> <!-- Google Tag Manager --> <script id="tag-manager-head-root">(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start': new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0], j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src= 'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f); })(window,document,'script','dataLayer_old','GTM-5G9JF7Z');</script> <!-- End Google Tag Manager --> <script> window.gptadslots = []; window.googletag = window.googletag || {}; window.googletag.cmd = window.googletag.cmd || []; </script> <script type="text/javascript"> // TODO(jacob): This should be defined, may be rare load order problem. // Checking if null is just a quick fix, will default to en if unset. // Better fix is to run this immedietely after I18n is set. if (window.I18n != null) { I18n.defaultLocale = "en"; I18n.locale = "en"; I18n.fallbacks = true; } </script> <link rel="canonical" href="https://mcgill.academia.edu/CatherineGuastavino" /> </head> <!--[if gte IE 9 ]> <body class='ie ie9 c-profiles/works a-summary logged_out'> <![endif]--> <!--[if !(IE) ]><!--> <body class='c-profiles/works a-summary logged_out'> <!--<![endif]--> <div id="fb-root"></div><script>window.fbAsyncInit = function() { FB.init({ appId: "2369844204", version: "v8.0", status: true, cookie: true, xfbml: true }); // Additional initialization code. if (window.InitFacebook) { // facebook.ts already loaded, set it up. 
window.InitFacebook(); } else { // Set a flag for facebook.ts to find when it loads. window.academiaAuthReadyFacebook = true; } };</script><script>window.fbAsyncLoad = function() { // Protection against double calling of this function if (window.FB) { return; } (function(d, s, id){ var js, fjs = d.getElementsByTagName(s)[0]; if (d.getElementById(id)) {return;} js = d.createElement(s); js.id = id; js.src = "//connect.facebook.net/en_US/sdk.js"; fjs.parentNode.insertBefore(js, fjs); }(document, 'script', 'facebook-jssdk')); } if (!window.defer_facebook) { // Autoload if not deferred window.fbAsyncLoad(); } else { // Defer loading by 5 seconds setTimeout(function() { window.fbAsyncLoad(); }, 5000); }</script> <div id="google-root"></div><script>window.loadGoogle = function() { if (window.InitGoogle) { // google.ts already loaded, set it up. window.InitGoogle("331998490334-rsn3chp12mbkiqhl6e7lu2q0mlbu0f1b"); } else { // Set a flag for google.ts to use when it loads. window.GoogleClientID = "331998490334-rsn3chp12mbkiqhl6e7lu2q0mlbu0f1b"; } };</script><script>window.googleAsyncLoad = function() { // Protection against double calling of this function (function(d) { var js; var id = 'google-jssdk'; var ref = d.getElementsByTagName('script')[0]; if (d.getElementById(id)) { return; } js = d.createElement('script'); js.id = id; js.async = true; js.onload = loadGoogle; js.src = "https://accounts.google.com/gsi/client" ref.parentNode.insertBefore(js, ref); }(document)); } if (!window.defer_google) { // Autoload if not deferred window.googleAsyncLoad(); } else { // Defer loading by 5 seconds setTimeout(function() { window.googleAsyncLoad(); }, 5000); }</script> <div id="tag-manager-body-root"> <!-- Google Tag Manager (noscript) --> <noscript><iframe src="https://www.googletagmanager.com/ns.html?id=GTM-5G9JF7Z" height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript> <!-- End Google Tag Manager (noscript) --> <!-- Event listeners for analytics --> <script> window.addEventListener('load', function() { if (document.querySelector('input[name="commit"]')) { document.querySelector('input[name="commit"]').addEventListener('click', function() { gtag('event', 'click', { event_category: 'button', event_label: 'Log In' }) }) } }); </script> </div> <script>var _comscore = _comscore || []; _comscore.push({ c1: "2", c2: "26766707" }); (function() { var s = document.createElement("script"), el = document.getElementsByTagName("script")[0]; s.async = true; s.src = (document.location.protocol == "https:" ? 
"https://sb" : "http://b") + ".scorecardresearch.com/beacon.js"; el.parentNode.insertBefore(s, el); })();</script><img src="https://sb.scorecardresearch.com/p?c1=2&amp;c2=26766707&amp;cv=2.0&amp;cj=1" style="position: absolute; visibility: hidden" /> <div id='react-modal'></div> <div class='DesignSystem'> <a class='u-showOnFocus' href='#site'> Skip to main content </a> </div> <div id="upgrade_ie_banner" style="display: none;"><p>Academia.edu no longer supports Internet Explorer.</p><p>To browse Academia.edu and the wider internet faster and more securely, please take a few seconds to&nbsp;<a href="https://www.academia.edu/upgrade-browser">upgrade your browser</a>.</p></div><script>// Show this banner for all versions of IE if (!!window.MSInputMethodContext || /(MSIE)/.test(navigator.userAgent)) { document.getElementById('upgrade_ie_banner').style.display = 'block'; }</script> <div class="DesignSystem bootstrap ShrinkableNav"><div class="navbar navbar-default main-header"><div class="container-wrapper" id="main-header-container"><div class="container"><div class="navbar-header"><div class="nav-left-wrapper u-mt0x"><div class="nav-logo"><a data-main-header-link-target="logo_home" href="https://www.academia.edu/"><img class="visible-xs-inline-block" style="height: 24px;" alt="Academia.edu" src="//a.academia-assets.com/images/academia-logo-redesign-2015-A.svg" width="24" height="24" /><img width="145.2" height="18" class="hidden-xs" style="height: 24px;" alt="Academia.edu" src="//a.academia-assets.com/images/academia-logo-redesign-2015.svg" /></a></div><div class="nav-search"><div class="SiteSearch-wrapper select2-no-default-pills"><form class="js-SiteSearch-form DesignSystem" action="https://www.academia.edu/search" accept-charset="UTF-8" method="get"><i class="SiteSearch-icon fa fa-search u-fw700 u-positionAbsolute u-tcGrayDark"></i><input class="js-SiteSearch-form-input SiteSearch-form-input form-control" data-main-header-click-target="search_input" name="q" placeholder="Search" type="text" value="" /></form></div></div></div><div class="nav-right-wrapper pull-right"><ul class="NavLinks js-main-nav list-unstyled"><li class="NavLinks-link"><a class="js-header-login-url Button Button--inverseGray Button--sm u-mb4x" id="nav_log_in" rel="nofollow" href="https://www.academia.edu/login">Log In</a></li><li class="NavLinks-link u-p0x"><a class="Button Button--inverseGray Button--sm u-mb4x" rel="nofollow" href="https://www.academia.edu/signup">Sign Up</a></li></ul><button class="hidden-lg hidden-md hidden-sm u-ml4x navbar-toggle collapsed" data-target=".js-mobile-header-links" data-toggle="collapse" type="button"><span class="icon-bar"></span><span class="icon-bar"></span><span class="icon-bar"></span></button></div></div><div class="collapse navbar-collapse js-mobile-header-links"><ul class="nav navbar-nav"><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://www.academia.edu/login">Log In</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://www.academia.edu/signup">Sign Up</a></li><li class="u-borderColorGrayLight u-borderBottom1 js-mobile-nav-expand-trigger"><a href="#">more&nbsp<span class="caret"></span></a></li><li><ul class="js-mobile-nav-expand-section nav navbar-nav u-m0x collapse"><li class="u-borderColorGrayLight u-borderBottom1"><a rel="false" href="https://www.academia.edu/about">About</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://www.academia.edu/press">Press</a></li><li 
class="u-borderColorGrayLight u-borderBottom1"><a rel="false" href="https://www.academia.edu/documents">Papers</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://www.academia.edu/terms">Terms</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://www.academia.edu/privacy">Privacy</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://www.academia.edu/copyright">Copyright</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://www.academia.edu/hiring"><i class="fa fa-briefcase"></i>&nbsp;We're Hiring!</a></li><li class="u-borderColorGrayLight u-borderBottom1"><a rel="nofollow" href="https://support.academia.edu/hc/en-us"><i class="fa fa-question-circle"></i>&nbsp;Help Center</a></li><li class="js-mobile-nav-collapse-trigger u-borderColorGrayLight u-borderBottom1 dropup" style="display:none"><a href="#">less&nbsp<span class="caret"></span></a></li></ul></li></ul></div></div></div><script>(function(){ var $moreLink = $(".js-mobile-nav-expand-trigger"); var $lessLink = $(".js-mobile-nav-collapse-trigger"); var $section = $('.js-mobile-nav-expand-section'); $moreLink.click(function(ev){ ev.preventDefault(); $moreLink.hide(); $lessLink.show(); $section.collapse('show'); }); $lessLink.click(function(ev){ ev.preventDefault(); $moreLink.show(); $lessLink.hide(); $section.collapse('hide'); }); })() if ($a.is_logged_in() || false) { new Aedu.NavigationController({ el: '.js-main-nav', showHighlightedNotification: false }); } else { $(".js-header-login-url").attr("href", $a.loginUrlWithRedirect()); } Aedu.autocompleteSearch = new AutocompleteSearch({el: '.js-SiteSearch-form'});</script></div></div> <div id='site' class='fixed'> <div id="content" class="clearfix"> <script>document.addEventListener('DOMContentLoaded', function(){ var $dismissible = $(".dismissible_banner"); $dismissible.click(function(ev) { $dismissible.hide(); }); });</script> <script src="//a.academia-assets.com/assets/webpack_bundles/profile.wjs-bundle-73437c607b22c9916b71c43ebb9de6318ee6651f950e38659bcd7aaee7d11f28.js" defer="defer"></script><script>$viewedUser = Aedu.User.set_viewed( {"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino","photo":"https://0.academia-photos.com/45597668/117885521/107192199/s65_catherine.guastavino.jpg","has_photo":true,"department":{"id":26143,"name":"School of Information Studies","url":"https://mcgill.academia.edu/Departments/School_of_Information_Studies/Documents","university":{"id":626,"name":"McGill University","url":"https://mcgill.academia.edu/"}},"position":"Faculty Member","position_id":1,"is_analytics_public":false,"interests":[{"id":34477,"name":"Music Performance","url":"https://www.academia.edu/Documents/in/Music_Performance"},{"id":54333,"name":"Soundscape","url":"https://www.academia.edu/Documents/in/Soundscape"},{"id":59693,"name":"Hearing","url":"https://www.academia.edu/Documents/in/Hearing"},{"id":4158,"name":"Psychoacoustics","url":"https://www.academia.edu/Documents/in/Psychoacoustics"},{"id":31793,"name":"Sound Art","url":"https://www.academia.edu/Documents/in/Sound_Art"},{"id":6634,"name":"Soundscape Studies","url":"https://www.academia.edu/Documents/in/Soundscape_Studies"},{"id":4147,"name":"Audio 
Engineering","url":"https://www.academia.edu/Documents/in/Audio_Engineering"},{"id":3031,"name":"Auditory Perception","url":"https://www.academia.edu/Documents/in/Auditory_Perception"},{"id":11081,"name":"Multimodal Interaction","url":"https://www.academia.edu/Documents/in/Multimodal_Interaction"}]} ); if ($a.is_logged_in() && $viewedUser.is_current_user()) { $('body').addClass('profile-viewed-by-owner'); } $socialProfiles = []</script><div id="js-react-on-rails-context" style="display:none" data-rails-context="{&quot;inMailer&quot;:false,&quot;i18nLocale&quot;:&quot;en&quot;,&quot;i18nDefaultLocale&quot;:&quot;en&quot;,&quot;href&quot;:&quot;https://mcgill.academia.edu/CatherineGuastavino&quot;,&quot;location&quot;:&quot;/CatherineGuastavino&quot;,&quot;scheme&quot;:&quot;https&quot;,&quot;host&quot;:&quot;mcgill.academia.edu&quot;,&quot;port&quot;:null,&quot;pathname&quot;:&quot;/CatherineGuastavino&quot;,&quot;search&quot;:null,&quot;httpAcceptLanguage&quot;:null,&quot;serverSide&quot;:false}"></div> <div class="js-react-on-rails-component" style="display:none" data-component-name="ProfileCheckPaperUpdate" data-props="{}" data-trace="false" data-dom-id="ProfileCheckPaperUpdate-react-component-a1f0c142-34dd-4da5-a82b-737e0752d0be"></div> <div id="ProfileCheckPaperUpdate-react-component-a1f0c142-34dd-4da5-a82b-737e0752d0be"></div> <div class="DesignSystem"><div class="onsite-ping" id="onsite-ping"></div></div><div class="profile-user-info DesignSystem"><div class="social-profile-container"><div class="left-panel-container"><div class="user-info-component-wrapper"><div class="user-summary-cta-container"><div class="user-summary-container"><div class="social-profile-avatar-container"><img class="profile-avatar u-positionAbsolute" alt="Catherine Guastavino" border="0" onerror="if (this.src != &#39;//a.academia-assets.com/images/s200_no_pic.png&#39;) this.src = &#39;//a.academia-assets.com/images/s200_no_pic.png&#39;;" width="200" height="200" src="https://0.academia-photos.com/45597668/117885521/107192199/s200_catherine.guastavino.jpg" /></div><div class="title-container"><h1 class="ds2-5-heading-sans-serif-sm">Catherine Guastavino</h1><div class="affiliations-container fake-truncate js-profile-affiliations"><div><a class="u-tcGrayDarker" href="https://mcgill.academia.edu/">McGill University</a>, <a class="u-tcGrayDarker" href="https://mcgill.academia.edu/Departments/School_of_Information_Studies/Documents">School of Information Studies</a>, <span class="u-tcGrayDarker">Faculty Member</span></div></div></div></div><div class="sidebar-cta-container"><button class="ds2-5-button hidden profile-cta-button grow js-profile-follow-button" data-broccoli-component="user-info.follow-button" data-click-track="profile-user-info-follow-button" data-follow-user-fname="Catherine" data-follow-user-id="45597668" data-follow-user-source="profile_button" data-has-google="false"><span class="material-symbols-outlined" style="font-size: 20px" translate="no">add</span>Follow</button><button class="ds2-5-button hidden profile-cta-button grow js-profile-unfollow-button" data-broccoli-component="user-info.unfollow-button" data-click-track="profile-user-info-unfollow-button" data-unfollow-user-id="45597668"><span class="material-symbols-outlined" style="font-size: 20px" translate="no">done</span>Following</button></div></div><div class="user-stats-container"><a><div class="stat-container js-profile-followers"><p class="label">Followers</p><p class="data">182</p></div></a><a><div class="stat-container 
js-profile-followees" data-broccoli-component="user-info.followees-count" data-click-track="profile-expand-user-info-following"><p class="label">Following</p><p class="data">19</p></div></a><a><div class="stat-container js-profile-coauthors" data-broccoli-component="user-info.coauthors-count" data-click-track="profile-expand-user-info-coauthors"><p class="label">Co-authors</p><p class="data">18</p></div></a><div class="js-mentions-count-container" style="display: none;"><a href="/CatherineGuastavino/mentions"><div class="stat-container"><p class="label">Mentions</p><p class="data"></p></div></a></div><span><div class="stat-container"><p class="label"><span class="js-profile-total-view-text">Public Views</span></p><p class="data"><span class="js-profile-view-count"></span></p></div></span></div><div class="suggested-academics-container"><div class="suggested-academics--header"><p class="ds2-5-body-md-bold">Related Authors</p></div><ul class="suggested-user-card-list"><div class="suggested-user-card"><div class="suggested-user-card__avatar social-profile-avatar-container"><a href="https://uib-es.academia.edu/ElenaGarc%C3%ADavald%C3%A9s"><img class="profile-avatar u-positionAbsolute" border="0" alt="" src="//a.academia-assets.com/images/s200_no_pic.png" /></a></div><div class="suggested-user-card__user-info"><a class="suggested-user-card__user-info__header ds2-5-body-sm-bold ds2-5-body-link" href="https://uib-es.academia.edu/ElenaGarc%C3%ADavald%C3%A9s">Elena García-valdés</a><p class="suggested-user-card__user-info__subheader ds2-5-body-xs">Universitat de les Illes Balears</p></div></div><div class="suggested-user-card"><div class="suggested-user-card__avatar social-profile-avatar-container"><a href="https://concordia.academia.edu/MurielLuderowski"><img class="profile-avatar u-positionAbsolute" border="0" alt="" src="//a.academia-assets.com/images/s200_no_pic.png" /></a></div><div class="suggested-user-card__user-info"><a class="suggested-user-card__user-info__header ds2-5-body-sm-bold ds2-5-body-link" href="https://concordia.academia.edu/MurielLuderowski">Muriel Luderowski</a><p class="suggested-user-card__user-info__subheader ds2-5-body-xs">Concordia University (Canada)</p></div></div><div class="suggested-user-card"><div class="suggested-user-card__avatar social-profile-avatar-container"><a href="https://independent.academia.edu/jsadaba"><img class="profile-avatar u-positionAbsolute" alt="juan sadaba" border="0" onerror="if (this.src != &#39;//a.academia-assets.com/images/s200_no_pic.png&#39;) this.src = &#39;//a.academia-assets.com/images/s200_no_pic.png&#39;;" width="200" height="200" src="https://0.academia-photos.com/122497093/49469750/37445942/s200_juan.sadaba.jpeg" /></a></div><div class="suggested-user-card__user-info"><a class="suggested-user-card__user-info__header ds2-5-body-sm-bold ds2-5-body-link" href="https://independent.academia.edu/jsadaba">juan sadaba</a></div></div><div class="suggested-user-card"><div class="suggested-user-card__avatar social-profile-avatar-container"><a href="https://independent.academia.edu/MurrayParker2"><img class="profile-avatar u-positionAbsolute" border="0" alt="" src="//a.academia-assets.com/images/s200_no_pic.png" /></a></div><div class="suggested-user-card__user-info"><a class="suggested-user-card__user-info__header ds2-5-body-sm-bold ds2-5-body-link" href="https://independent.academia.edu/MurrayParker2">Murray Parker</a></div></div><div class="suggested-user-card"><div class="suggested-user-card__avatar social-profile-avatar-container"><a 
href="https://independent.academia.edu/SanjayKumar760"><img class="profile-avatar u-positionAbsolute" border="0" alt="" src="//a.academia-assets.com/images/s200_no_pic.png" /></a></div><div class="suggested-user-card__user-info"><a class="suggested-user-card__user-info__header ds2-5-body-sm-bold ds2-5-body-link" href="https://independent.academia.edu/SanjayKumar760">Sanjay Kumar</a></div></div><div class="suggested-user-card"><div class="suggested-user-card__avatar social-profile-avatar-container"><a href="https://jamd.academia.edu/KVolniansky"><img class="profile-avatar u-positionAbsolute" alt="Karel Volniansky" border="0" onerror="if (this.src != &#39;//a.academia-assets.com/images/s200_no_pic.png&#39;) this.src = &#39;//a.academia-assets.com/images/s200_no_pic.png&#39;;" width="200" height="200" src="https://0.academia-photos.com/185791183/66931455/55291051/s200_karel.volniansky.png" /></a></div><div class="suggested-user-card__user-info"><a class="suggested-user-card__user-info__header ds2-5-body-sm-bold ds2-5-body-link" href="https://jamd.academia.edu/KVolniansky">Karel Volniansky</a><p class="suggested-user-card__user-info__subheader ds2-5-body-xs">The Jerusalem Academy of Music and Dance</p></div></div><div class="suggested-user-card"><div class="suggested-user-card__avatar social-profile-avatar-container"><a href="https://independent.academia.edu/BlagovestaMomchedjikova"><img class="profile-avatar u-positionAbsolute" border="0" alt="" src="//a.academia-assets.com/images/s200_no_pic.png" /></a></div><div class="suggested-user-card__user-info"><a class="suggested-user-card__user-info__header ds2-5-body-sm-bold ds2-5-body-link" href="https://independent.academia.edu/BlagovestaMomchedjikova">Blagovesta Momchedjikova</a></div></div><div class="suggested-user-card"><div class="suggested-user-card__avatar social-profile-avatar-container"><a href="https://stuba.academia.edu/KatarinaKristianova"><img class="profile-avatar u-positionAbsolute" alt="Katarina Kristianova" border="0" onerror="if (this.src != &#39;//a.academia-assets.com/images/s200_no_pic.png&#39;) this.src = &#39;//a.academia-assets.com/images/s200_no_pic.png&#39;;" width="200" height="200" src="https://0.academia-photos.com/4826685/2069147/3412593/s200_katarina.kristianova.jpg" /></a></div><div class="suggested-user-card__user-info"><a class="suggested-user-card__user-info__header ds2-5-body-sm-bold ds2-5-body-link" href="https://stuba.academia.edu/KatarinaKristianova">Katarina Kristianova</a><p class="suggested-user-card__user-info__subheader ds2-5-body-xs">Slovak University of Technology</p></div></div><div class="suggested-user-card"><div class="suggested-user-card__avatar social-profile-avatar-container"><a href="https://ucl.academia.edu/TinOberman"><img class="profile-avatar u-positionAbsolute" border="0" alt="" src="//a.academia-assets.com/images/s200_no_pic.png" /></a></div><div class="suggested-user-card__user-info"><a class="suggested-user-card__user-info__header ds2-5-body-sm-bold ds2-5-body-link" href="https://ucl.academia.edu/TinOberman">Tin Oberman</a><p class="suggested-user-card__user-info__subheader ds2-5-body-xs">University College London</p></div></div><div class="suggested-user-card"><div class="suggested-user-card__avatar social-profile-avatar-container"><a href="https://upm-es.academia.edu/C%C3%A9sarAsensio"><img class="profile-avatar u-positionAbsolute" border="0" alt="" src="//a.academia-assets.com/images/s200_no_pic.png" /></a></div><div class="suggested-user-card__user-info"><a 
class="suggested-user-card__user-info__header ds2-5-body-sm-bold ds2-5-body-link" href="https://upm-es.academia.edu/C%C3%A9sarAsensio">César Asensio</a><p class="suggested-user-card__user-info__subheader ds2-5-body-xs">Universidad Politécnica de Madrid</p></div></div></ul></div><div class="ri-section"><div class="ri-section-header"><span>Interests</span><a class="ri-more-link js-profile-ri-list-card" data-click-track="profile-user-info-primary-research-interest" data-has-card-for-ri-list="45597668">View All (9)</a></div><div class="ri-tags-container"><a data-click-track="profile-user-info-expand-research-interests" data-has-card-for-ri-list="45597668" href="https://www.academia.edu/Documents/in/Music_Performance"><div id="js-react-on-rails-context" style="display:none" data-rails-context="{&quot;inMailer&quot;:false,&quot;i18nLocale&quot;:&quot;en&quot;,&quot;i18nDefaultLocale&quot;:&quot;en&quot;,&quot;href&quot;:&quot;https://mcgill.academia.edu/CatherineGuastavino&quot;,&quot;location&quot;:&quot;/CatherineGuastavino&quot;,&quot;scheme&quot;:&quot;https&quot;,&quot;host&quot;:&quot;mcgill.academia.edu&quot;,&quot;port&quot;:null,&quot;pathname&quot;:&quot;/CatherineGuastavino&quot;,&quot;search&quot;:null,&quot;httpAcceptLanguage&quot;:null,&quot;serverSide&quot;:false}"></div> <div class="js-react-on-rails-component" style="display:none" data-component-name="Pill" data-props="{&quot;color&quot;:&quot;gray&quot;,&quot;children&quot;:[&quot;Music Performance&quot;]}" data-trace="false" data-dom-id="Pill-react-component-e9ad618c-2c59-45f2-8163-8d991f52b561"></div> <div id="Pill-react-component-e9ad618c-2c59-45f2-8163-8d991f52b561"></div> </a><a data-click-track="profile-user-info-expand-research-interests" data-has-card-for-ri-list="45597668" href="https://www.academia.edu/Documents/in/Soundscape"><div class="js-react-on-rails-component" style="display:none" data-component-name="Pill" data-props="{&quot;color&quot;:&quot;gray&quot;,&quot;children&quot;:[&quot;Soundscape&quot;]}" data-trace="false" data-dom-id="Pill-react-component-77ad3c14-f677-452f-9dd4-eee0bc09e703"></div> <div id="Pill-react-component-77ad3c14-f677-452f-9dd4-eee0bc09e703"></div> </a><a data-click-track="profile-user-info-expand-research-interests" data-has-card-for-ri-list="45597668" href="https://www.academia.edu/Documents/in/Hearing"><div class="js-react-on-rails-component" style="display:none" data-component-name="Pill" data-props="{&quot;color&quot;:&quot;gray&quot;,&quot;children&quot;:[&quot;Hearing&quot;]}" data-trace="false" data-dom-id="Pill-react-component-e8d508d1-e435-4e91-a415-57aacca6fc63"></div> <div id="Pill-react-component-e8d508d1-e435-4e91-a415-57aacca6fc63"></div> </a><a data-click-track="profile-user-info-expand-research-interests" data-has-card-for-ri-list="45597668" href="https://www.academia.edu/Documents/in/Psychoacoustics"><div class="js-react-on-rails-component" style="display:none" data-component-name="Pill" data-props="{&quot;color&quot;:&quot;gray&quot;,&quot;children&quot;:[&quot;Psychoacoustics&quot;]}" data-trace="false" data-dom-id="Pill-react-component-8b3d6d2f-b92a-489e-b6cf-ff7f0b9d25fa"></div> <div id="Pill-react-component-8b3d6d2f-b92a-489e-b6cf-ff7f0b9d25fa"></div> </a><a data-click-track="profile-user-info-expand-research-interests" data-has-card-for-ri-list="45597668" href="https://www.academia.edu/Documents/in/Sound_Art"><div class="js-react-on-rails-component" style="display:none" data-component-name="Pill" 
data-props="{&quot;color&quot;:&quot;gray&quot;,&quot;children&quot;:[&quot;Sound Art&quot;]}" data-trace="false" data-dom-id="Pill-react-component-d87ef9cf-d03c-4189-b5a7-23f49c205d66"></div> <div id="Pill-react-component-d87ef9cf-d03c-4189-b5a7-23f49c205d66"></div> </a></div></div></div></div><div class="right-panel-container"><div class="user-content-wrapper"><div class="uploads-container" id="social-redesign-work-container"><div class="upload-header"><h2 class="ds2-5-heading-sans-serif-xs">Uploads</h2></div><div class="nav-container backbone-profile-documents-nav hidden-xs"><ul class="nav-tablist" role="tablist"><li class="nav-chip active" role="presentation"><a data-section-name="" data-toggle="tab" href="#all" role="tab">all</a></li><li class="nav-chip" role="presentation"><a class="js-profile-docs-nav-section u-textTruncate" data-click-track="profile-works-tab" data-section-name="Papers" data-toggle="tab" href="#papers" role="tab" title="Papers"><span>312</span>&nbsp;<span class="ds2-5-body-sm-bold">Papers</span></a></li><li class="nav-chip" role="presentation"><a class="js-profile-docs-nav-section u-textTruncate" data-click-track="profile-works-tab" data-section-name="Conference-Proceedings" data-toggle="tab" href="#conferenceproceedings" role="tab" title="Conference Proceedings"><span>22</span>&nbsp;<span class="ds2-5-body-sm-bold">Conference Proceedings</span></a></li><li class="nav-chip" role="presentation"><a class="js-profile-docs-nav-section u-textTruncate" data-click-track="profile-works-tab" data-section-name="Journal-Articles" data-toggle="tab" href="#journalarticles" role="tab" title="Journal Articles"><span>30</span>&nbsp;<span class="ds2-5-body-sm-bold">Journal Articles</span></a></li><li class="nav-chip more-tab" role="presentation"><a class="js-profile-documents-more-tab link-unstyled u-textTruncate" data-toggle="dropdown" role="tab">More&nbsp;&nbsp;<i class="fa fa-chevron-down"></i></a><ul class="js-profile-documents-more-dropdown dropdown-menu dropdown-menu-right profile-documents-more-dropdown" role="menu"><li role="presentation"><a data-click-track="profile-works-tab" data-section-name="Digital-effects" data-toggle="tab" href="#digitaleffects" role="tab" style="border: none;"><span>1</span>&nbsp;Digital effects</a></li></ul></li></ul></div><div class="divider ds-divider-16" style="margin: 0px;"></div><div class="documents-container backbone-social-profile-documents" style="width: 100%;"><div class="u-taCenter"></div><div class="profile--tab_content_container js-tab-pane tab-pane active" id="all"><div class="profile--tab_heading_container js-section-heading" data-section="Papers" id="Papers"><h3 class="profile--tab_heading_container">Papers by Catherine Guastavino</h3></div><div class="js-work-strip profile--work_container" data-work-id="110629292"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/110629292/Montreal_soundscapes_during_the_COVID_19_pandemic_A_spatial_analysis_of_noise_complaints_and_residents_surveys"><img alt="Research paper thumbnail of Montreal soundscapes during the COVID-19 pandemic: A spatial analysis of noise complaints and residents’ surveys" class="work-thumbnail" src="https://attachments.academia-assets.com/108387873/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" 
data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629292/Montreal_soundscapes_during_the_COVID_19_pandemic_A_spatial_analysis_of_noise_complaints_and_residents_surveys">Montreal soundscapes during the COVID-19 pandemic: A spatial analysis of noise complaints and residents’ surveys</a></div><div class="wp-workCard_item"><span>Noise Mapping</span><span>, 2023</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Public health measures during the COVID-19 pandemic provided researchers with a quasi-experimenta...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Public health measures during the COVID-19 pandemic provided researchers with a quasi-experimental situation to examine what happens when anthropogenic noise sources (e.g., traffic) are greatly reduced. This article combines noise-related calls to Montreal&#39;s 311 service (29,891 calls from 2014 to 2022) with original survey data from 240 residents collected in 2020 after the lockdown and the summer reopening. The spatial analysis of the calls revealed that, across all pandemic phases, noise complaints increased with population density, the proportion of low-income residents, and the proportion of greenspace. However, the change in the spatial distribution of noise-related calls due to the pandemic measures is positively associated with the proportions of residential and greenspace land use. That is, areas with higher proportions of residential land use and greenspace experienced the greatest increase in noiserelated calls. The analysis of the survey revealed that the sounds of traffic and construction decreased during both the lockdown and the subsequent reopening, while the sounds of the neighborhood and nature increased. However, the decreased traffic noise in the downtown core also allowed for the emergence of noise from the heating, ventilation and air conditioning systems in the area. 
We discuss these results considering the interest in reducing noise levels in cities.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="38da13f0488c48a057fa8ddd7a854849" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108387873,&quot;asset_id&quot;:110629292,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108387873/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629292"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629292"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629292; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629292]").text(description); $(".js-view-count[data-work-id=110629292]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629292; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629292']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "38da13f0488c48a057fa8ddd7a854849" } } $('.js-work-strip[data-work-id=110629292]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629292,"title":"Montreal soundscapes during the COVID-19 pandemic: A spatial analysis of noise complaints and residents’ surveys","internal_url":"https://www.academia.edu/110629292/Montreal_soundscapes_during_the_COVID_19_pandemic_A_spatial_analysis_of_noise_complaints_and_residents_surveys","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine 
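The spatial analysis is only summarized in the abstract above. The following Python fragment is a minimal, purely illustrative sketch of that kind of analysis, assuming hypothetical input files (noise_311_calls.csv, area_covariates.csv) and hypothetical column names (area_id, phase, pop_density, pct_low_income, pct_greenspace); it is not the study's actual data, variables, or code.

```python
# Illustrative sketch only: hypothetical files and column names, not the study's data or code.
import pandas as pd

calls = pd.read_csv("noise_311_calls.csv")   # hypothetical: one row per noise-related 311 call (area_id, phase, ...)
areas = pd.read_csv("area_covariates.csv")   # hypothetical: area_id, pop_density, pct_low_income, pct_greenspace

# Count noise-related calls per area and pandemic phase.
counts = (calls.groupby(["area_id", "phase"])
               .size()
               .rename("n_calls")
               .reset_index())

# Join complaint counts with area-level covariates.
merged = counts.merge(areas, on="area_id", how="left")

# Rank correlations between call counts and each covariate, within each phase.
covariates = ["pop_density", "pct_low_income", "pct_greenspace"]
for phase, grp in merged.groupby("phase"):
    corr = grp[["n_calls"] + covariates].corr(method="spearman")
    print(phase)
    print(corr.loc[covariates, "n_calls"])
```

A rank correlation is used here only because it makes no distributional assumptions; the abstract does not specify which statistics the study actually reports.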
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":108387873,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108387873/thumbnails/1.jpg","file_name":"pdf.pdf","download_url":"https://www.academia.edu/attachments/108387873/download_file","bulk_download_file_name":"Montreal_soundscapes_during_the_COVID_19.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108387873/pdf-libre.pdf?1701783003=\u0026response-content-disposition=attachment%3B+filename%3DMontreal_soundscapes_during_the_COVID_19.pdf\u0026Expires=1741192868\u0026Signature=ToM9-xO~gxAkPJ86AtxVnctBhzJtTFWYRpXcr0hajP93LFd72mexsaUqCLce02tpqThy8Axw0Jh67WrT5rG23cd59vLT2PetRnz1BQPLmV2-aN5NZ~jDwyPbwaHRbZNJYNK~E46Gchj1ohRWsauDL4ET6KOcQYWPVfYQtdrbwidypWma6Q6fksIvhXW0mawz5~9YT2iCXE3-69l97KdzPb41JleBq4SqIy0e~0Xx8q4d0Jf1M6Deyb3s9TYteZRiRoJ5tM3SKMyuSXPa3OtQF33~p2sVwINxwHyy6vJcuaYA29W8G4ksNiwZsyxiuEC5H3SFDPCbyliyYEbGNZRgWA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"},{"id":108387874,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108387874/thumbnails/1.jpg","file_name":"pdf.pdf","download_url":"https://www.academia.edu/attachments/108387874/download_file","bulk_download_file_name":"Montreal_soundscapes_during_the_COVID_19.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108387874/pdf-libre.pdf?1701783005=\u0026response-content-disposition=attachment%3B+filename%3DMontreal_soundscapes_during_the_COVID_19.pdf\u0026Expires=1741192868\u0026Signature=fwrzumTwaryjtKfoptOvF5a3NpT3VHXOopGViq~jA2uR5XBlWlCYuybgNP1jMjuEuM3TpRgrq3qoMIYejZUBXfj2AhDOt5RlGxy4T7I9IXDTs6joIROe0jZciS-OyMOS4X9yOILNj4bebUzLoW3rFh1EZvarOnGHoBkh9Dl4SZCL4Voj11YDlYGSvNUwTUbbI-yc~oSALU-pGKVXqpfgRVpa0i7MZuQvOWhRFClbtVoiubhK~Z5Q3pxRps0ctZMWVpeo7RhYX~Agb7msHd89nXCI-jpvOhqQC~kuWBCWGh6Do5VGKnZvSl5fgxibmncU6gTSlPXRvIMyU6J5xgAksQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629291"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/110629291/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_I_Algorithm"><img alt="Research paper thumbnail of Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part I Algorithm" class="work-thumbnail" src="https://attachments.academia-assets.com/108388013/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629291/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_I_Algorithm">Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part I Algorithm</a></div><div class="wp-workCard_item"><span>Journal of The Audio Engineering Society</span><span>, Apr 21, 2016</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Diffuse Field Modeling (DFM) is a systematic means for simulating and reproducing a diffuse field...</span><a class="js-work-more-abstract" 
data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Diffuse Field Modeling (DFM) is a systematic means for simulating and reproducing a diffuse field for arbitrary loudspeaker configurations. DFM is presented in two publications: Part I [1] presents the algorithm and this Part II reports the perceptual evaluation. Two experiments were conducted: in Experiment 1 sound recording professionals were to rate different treatments of DFM presented on a 20-channel array. The treatments under evaluation included the geometric modeling of reflections, strategies involving the early portion of the B-Format Room Impulse Response, and a comparison between 0 th and 1 st-order RIR. Results indicate that it is necessary to model the earliest reflections and to use all four channels of the B-Format room impulse response. In Experiment 2 musicians and sound recording professionals were asked to rate DFM and common microphone techniques presented on 3/2 stereophonic setup. DFM was found to be perceptually comparable with the Hamasaki Square technique.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="f62f1ee78d9f4851402884762ff61124" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108388013,&quot;asset_id&quot;:110629291,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108388013/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629291"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629291"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629291; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629291]").text(description); $(".js-view-count[data-work-id=110629291]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629291; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629291']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 
})(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "f62f1ee78d9f4851402884762ff61124" } } $('.js-work-strip[data-work-id=110629291]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629291,"title":"Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part I Algorithm","internal_url":"https://www.academia.edu/110629291/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_I_Algorithm","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":108388013,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108388013/thumbnails/1.jpg","file_name":"jaes.2016.000220231205-1-pi4deb.pdf","download_url":"https://www.academia.edu/attachments/108388013/download_file","bulk_download_file_name":"Diffuse_Field_Modeling_Using_Physically.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108388013/jaes.2016.000220231205-1-pi4deb-libre.pdf?1701787253=\u0026response-content-disposition=attachment%3B+filename%3DDiffuse_Field_Modeling_Using_Physically.pdf\u0026Expires=1741324679\u0026Signature=IltVXbeGfcScTvAR1FUYdHJnMevVpfYuv7K2raaa~~dZjQwblpLosw7aMvCKjgYaw1pqrTKAufe2xC-kygxk8Q3~k86oz0Xjbe4tddXPPSey1VqHzqtPRW-O0glpUBMJfwaI0UNl-d1tnIvs7k27dbEdYDIixwsQqpUUEsNUyr4tgLhUN8bLrAfkWzgmBiAX79BtBN3UBOkwrW1bM8BMKVJRPUFQ25efMhN8wPij-HS9AomQs4fE1M4pWzsfNb35yP3xE~B1UkH--95K0nLa1nXYJM3QJ2~pIHA25z5ZZfguyZNU39oIkyBtyj3hiEZPKDu6mhH3Z--kJGUkMUn~~w__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629290"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629290/Chapter_10_Questioning_sensory_experience"><img alt="Research paper thumbnail of Chapter 10. Questioning sensory experience" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629290/Chapter_10_Questioning_sensory_experience">Chapter 10. 
Questioning sensory experience</a></div><div class="wp-workCard_item"><span>John Benjamins Publishing Company eBooks</span><span>, Dec 15, 2021</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">International audienc</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629290"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629290"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629290; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629290]").text(description); $(".js-view-count[data-work-id=110629290]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629290; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629290']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629290]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629290,"title":"Chapter 10. 
Questioning sensory experience","internal_url":"https://www.academia.edu/110629290/Chapter_10_Questioning_sensory_experience","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629289"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629289/Perceived_suitability_of_reverberation_in_large_coupled_volume_concert_halls"><img alt="Research paper thumbnail of Perceived suitability of reverberation in large coupled volume concert halls" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629289/Perceived_suitability_of_reverberation_in_large_coupled_volume_concert_halls">Perceived suitability of reverberation in large coupled volume concert halls</a></div><div class="wp-workCard_item"><span>Psychomusicology: Music, Mind and Brain</span><span>, Sep 1, 2015</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629289"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629289"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629289; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629289]").text(description); $(".js-view-count[data-work-id=110629289]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629289; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629289']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 
})(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629289]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629289,"title":"Perceived suitability of reverberation in large coupled volume concert halls","internal_url":"https://www.academia.edu/110629289/Perceived_suitability_of_reverberation_in_large_coupled_volume_concert_halls","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629288"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629288/Informing_sound_art_design_in_public_space_through_soundscape_simulation"><img alt="Research paper thumbnail of Informing sound art design in public space through soundscape simulation" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629288/Informing_sound_art_design_in_public_space_through_soundscape_simulation">Informing sound art design in public space through soundscape simulation</a></div><div class="wp-workCard_item"><span>NOISE-CON ... proceedings</span><span>, Feb 1, 2023</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Urban sound management often amounts to reducing sound levels with the underlying assumption of s...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Urban sound management often amounts to reducing sound levels with the underlying assumption of sound/noise as a nuisance. However, a reduction in sound level does not necessarily lead to a more pleasant auditory experience, especially in urban public spaces where vibrancy can be sought after. A proactive design approach that accounts for the human experience of sound environment is needed to improve the quality of urban spaces. Recent studies in soundscape research suggest that added sound and particularly sound art installations can have a positive influence on public space evaluations. 
Yet, the role of added sounds in urban context remains understudied and there is no existing method to date to inform sound art composition in public space through soundscape simulation. We present here a research-creation collaboration around the design of a permanent sound installation in an urban public space in Paris: Nadine Schütz&amp;#39;s Niches Acoustiques. We report on a series of listening tests involving High-Order Ambisonic soundscape simulations of different prototypes to inform the sound artist&amp;#39;s composition in order to optimize the quality of public space experience in the presence of the sound installation.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629288"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629288"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629288; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629288]").text(description); $(".js-view-count[data-work-id=110629288]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629288; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629288']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629288]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629288,"title":"Informing sound art design in public space through soundscape simulation","internal_url":"https://www.academia.edu/110629288/Informing_sound_art_design_in_public_space_through_soundscape_simulation","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: 
Applying the stratified model of relevance interactions to music information retrieval
Proceedings of the Association for Information Science and Technology, 2013

While research on the notion of relevance has a long and rich history in information retrieval for textual documents, formal considerations of relevance concepts in Music Information Retrieval (MIR) remain scarce. We discuss the application of Saracevic's stratified model of relevance interactions to the music information domain. This model offers a tool for deliberation on the development of user-oriented MIR systems, and a framework for the aggregation of findings on the music information needs and behaviours of potential users.
href="https://www.academia.edu/110629286/A_Perceptual_Evaluation_of_Room_Effect_Methods_for_Multichannel_Spatial_Audio"><img alt="Research paper thumbnail of A Perceptual Evaluation of Room Effect Methods for Multichannel Spatial Audio" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629286/A_Perceptual_Evaluation_of_Room_Effect_Methods_for_Multichannel_Spatial_Audio">A Perceptual Evaluation of Room Effect Methods for Multichannel Spatial Audio</a></div><div class="wp-workCard_item"><span>Journal of The Audio Engineering Society</span><span>, Oct 16, 2013</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629286"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629286"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629286; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629286]").text(description); $(".js-view-count[data-work-id=110629286]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629286; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629286']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629286]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629286,"title":"A Perceptual Evaluation of Room Effect Methods for Multichannel Spatial 
Audio","internal_url":"https://www.academia.edu/110629286/A_Perceptual_Evaluation_of_Room_Effect_Methods_for_Multichannel_Spatial_Audio","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629284"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629284/Supporting_Sounds_Design_and_Evaluation_of_an_Audio_Haptic_Interface"><img alt="Research paper thumbnail of Supporting Sounds: Design and Evaluation of an Audio-Haptic Interface" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629284/Supporting_Sounds_Design_and_Evaluation_of_an_Audio_Haptic_Interface">Supporting Sounds: Design and Evaluation of an Audio-Haptic Interface</a></div><div class="wp-workCard_item"><span>Springer eBooks</span><span>, 2012</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">ABSTRACT The design and evaluation of a multimodal interface is presented in order to investigate...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">ABSTRACT The design and evaluation of a multimodal interface is presented in order to investigate how spatial audio and haptic feedback can be used to convey the navigational structure of a virtual environment. The non-visual 3D virtual environment is composed of a number of parallel planes with either horizontal or vertical orientations. The interface was evaluated using a target-finding task to explore how auditory feedback can be used in isolation or combined with haptic feedback for navigation. Twenty-three users were asked to locate targets using auditory feedback in the virtual structure across both horizontal and vertical orientations of the planes, with and without haptic feedback. 
Findings from the evaluation experiment reveal that users performed the task faster in the bi-modal conditions (with combined auditory and haptic feedback) with a horizontal orientation of the virtual planes.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629284"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629284"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629284; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629284]").text(description); $(".js-view-count[data-work-id=110629284]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629284; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629284']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629284]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629284,"title":"Supporting Sounds: Design and Evaluation of an Audio-Haptic Interface","internal_url":"https://www.academia.edu/110629284/Supporting_Sounds_Design_and_Evaluation_of_an_Audio_Haptic_Interface","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629283"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" 
href="https://www.academia.edu/110629283/Perceptual_Evaluation_of_a_Real_time_Synthesis_Technique_for_Rolling_Sounds"><img alt="Research paper thumbnail of Perceptual Evaluation of a Real-time Synthesis Technique for Rolling Sounds" class="work-thumbnail" src="https://attachments.academia-assets.com/108791784/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629283/Perceptual_Evaluation_of_a_Real_time_Synthesis_Technique_for_Rolling_Sounds">Perceptual Evaluation of a Real-time Synthesis Technique for Rolling Sounds</a></div><div class="wp-workCard_item"><span>HAL (Le Centre pour la Communication Scientifique Directe)</span><span>, 2007</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">In this study 6 different versions of a new real-time synthesizer for contact sounds have been ev...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">In this study 6 different versions of a new real-time synthesizer for contact sounds have been evaluated in order to identify the most effective algorithm to create a realistic sound for rolling objects. 18 participants took part in a perceptual evaluation experiment. Results are presented in terms of both statistical analysis of the most effective synthesis algorithm and qualitative user comments. Finally recommendations for future implementations of synthesis techniques and subsequent perceptual evaluations are presented and discussed.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="e4e09133550f510a09e910f50ddfaf4e" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108791784,&quot;asset_id&quot;:110629283,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108791784/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629283"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629283"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629283; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629283]").text(description); $(".js-view-count[data-work-id=110629283]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629283; 
Etude sémantique et acoustique de la perception des basses fréquences dans l'environnement sonore urbain [Semantic and acoustic study of the perception of low frequencies in the urban sound environment]
https://www.academia.edu/110629282/Etude_s%C3%A9mantique_et_acoustique_de_la_perception_des_basses_fr%C3%A9quences_dans_lenvironnement_sonore_urbain

class="js-work-strip profile--work_container" data-work-id="110629281"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/110629281/Constructing_a_true_LCSH_tree_of_a_science_and_engineering_collection"><img alt="Research paper thumbnail of Constructing a true LCSH tree of a science and engineering collection" class="work-thumbnail" src="https://attachments.academia-assets.com/108388096/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629281/Constructing_a_true_LCSH_tree_of_a_science_and_engineering_collection">Constructing a true LCSH tree of a science and engineering collection</a></div><div class="wp-workCard_item"><span>Journal of the Association for Information Science and Technology</span><span>, Nov 15, 2012</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">The Library of Congress Subject Headings (LCSH) is a subject structure used to index large librar...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">The Library of Congress Subject Headings (LCSH) is a subject structure used to index large library collections throughout the world. Browsing a collection through LCSH is difficult using current online tools in part because users cannot explore the structure using their existing experience navigating file hierarchies on their hard drives. This is due to inconsistencies in the LCSH structure, which does not adhere to the specific rules defining tree structures. This article proposes a method to adapt the LCSH structure to reflect a real-world collection from the domain of science and engineering. This structure is transformed into a valid tree structure using an automatic process. The analysis of the resulting LCSH tree shows a large and complex structure. 
The analysis of the distribution of information within the LCSH tree reveals a power law distribution where the vast majority of subjects contain few information items and a few subjects contain the vast majority of the collection.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="7b1c5b90c599f0ad2b84e86fe0cdba5d" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108388096,&quot;asset_id&quot;:110629281,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108388096/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629281"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629281"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629281; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629281]").text(description); $(".js-view-count[data-work-id=110629281]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629281; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629281']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "7b1c5b90c599f0ad2b84e86fe0cdba5d" } } $('.js-work-strip[data-work-id=110629281]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629281,"title":"Constructing a true LCSH tree of a science and engineering collection","internal_url":"https://www.academia.edu/110629281/Constructing_a_true_LCSH_tree_of_a_science_and_engineering_collection","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine 
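The abstract does not spell out the automatic tree-construction process, so the following is only a minimal sketch, under assumed rules, of the kind of transformation involved: it coerces a subject graph that may contain multiple broader terms and cycles into a valid tree by keeping, for each heading, the first parent found on a shortest path from the root, and it tallies items per subject, the distribution the abstract characterizes as power-law-like. All names and data are hypothetical.

```python
# Hypothetical sketch: coerce an LCSH-like "narrower term" graph into a tree
# by keeping one parent per heading (the first found by BFS, i.e. on a shortest
# path from the root), then count catalogue items per subject.
from collections import deque

def graph_to_tree(root, narrower):
    """narrower maps a heading to its list of narrower headings.
    Returns child -> single chosen parent; multiple parents and cycles vanish."""
    parent, seen, queue = {}, {root}, deque([root])
    while queue:
        node = queue.popleft()
        for child in narrower.get(node, []):
            if child not in seen:        # first (shortest) path to the root wins
                seen.add(child)
                parent[child] = node
                queue.append(child)
    return parent

def items_per_subject(assignments):
    """assignments: iterable of (item_id, heading) pairs from the catalogue."""
    counts = {}
    for _, heading in assignments:
        counts[heading] = counts.get(heading, 0) + 1
    return counts

if __name__ == "__main__":
    narrower = {"Science": ["Physics", "Chemistry"],
                "Physics": ["Acoustics"],
                "Chemistry": ["Acoustics"]}   # "Acoustics" has two broader terms
    print(graph_to_tree("Science", narrower))
    print(items_per_subject([(1, "Acoustics"), (2, "Acoustics"), (3, "Physics")]))
```
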
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":108388096,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108388096/thumbnails/1.jpg","file_name":"asi.2274920231205-1-iyva8q.pdf","download_url":"https://www.academia.edu/attachments/108388096/download_file","bulk_download_file_name":"Constructing_a_true_LCSH_tree_of_a_scien.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108388096/asi.2274920231205-1-iyva8q-libre.pdf?1701787244=\u0026response-content-disposition=attachment%3B+filename%3DConstructing_a_true_LCSH_tree_of_a_scien.pdf\u0026Expires=1741324679\u0026Signature=Wzzy-vRtNbOOHjQo~RIaeicEx0T-9a9f4LBfuhSW8JfNl-txj1UTZxGY~JKKLlbmObRQKM3~VQSDnN7YOhNg3kGdragasHKOSdUQYcPmrkRLvIq2nM104QVRz01LD2NXkRUFwHQ~U~edtTFd4whbuiZucUATl2uJhuHNuWhaWx6Lq8MS~ahY4E6rI6zC4OaadKV63~ShARYc-gnfc0o7Z39FrEku-6mwsdWBxxwbHkQmErJtn7NhoPqrW~tp9YUiw5ur4aGKtOTd9vnRclBavE5IVceSmQ0L4wnhSHicJS05vY-q7NmYbrQCu5b9LgoBe-n2R8RYazLlZQp77plKgw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629280"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/110629280/User_Studies_in_the_Music_Information_Retrieval_Literature"><img alt="Research paper thumbnail of User Studies in the Music Information Retrieval Literature" class="work-thumbnail" src="https://attachments.academia-assets.com/108387871/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629280/User_Studies_in_the_Music_Information_Retrieval_Literature">User Studies in the Music Information Retrieval Literature</a></div><div class="wp-workCard_item"><span>International Symposium/Conference on Music Information Retrieval</span><span>, 2011</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">This paper presents an overview of user studies in the Music Information Retrieval (MIR) literatu...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">This paper presents an overview of user studies in the Music Information Retrieval (MIR) literature. A focus on the user has repeatedly been identified as a key requirement for future MIR research; yet empirical user studies have been relatively sparse in the literature, the overwhelming research attention in MIR remaining systems-focused. 
We present research topics, methodologies, and design implications covered in the user studies conducted thus far.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="62dd3b2e1498163e7d78712aef33e27b" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108387871,&quot;asset_id&quot;:110629280,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108387871/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629280"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629280"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629280; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629280]").text(description); $(".js-view-count[data-work-id=110629280]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629280; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629280']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "62dd3b2e1498163e7d78712aef33e27b" } } $('.js-work-strip[data-work-id=110629280]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629280,"title":"User Studies in the Music Information Retrieval Literature","internal_url":"https://www.academia.edu/110629280/User_Studies_in_the_Music_Information_Retrieval_Literature","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine 
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":108387871,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108387871/thumbnails/1.jpg","file_name":"OS5-1.pdf","download_url":"https://www.academia.edu/attachments/108387871/download_file","bulk_download_file_name":"User_Studies_in_the_Music_Information_Re.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108387871/OS5-1-libre.pdf?1701783003=\u0026response-content-disposition=attachment%3B+filename%3DUser_Studies_in_the_Music_Information_Re.pdf\u0026Expires=1741338292\u0026Signature=OKZKJpaG4sxT1NjqL3dKon3PQ2DmfFBbz~lL8n40PxQNZS~P65hxYDGtYQhpIv0J9~JCIvjADsVwO9vQO36uLz8GSmQvIij9Ohnu0VJxnXqDI0AdwDll9ZbYC5g7cqlwreJI853XUuFcAfxcUvzDNf-ksCQcQ~Pg-jD-~lIPR47JfxH0B7S2yPZ6pAwavw1LU14XDakrANjnF1x4LzIysphShUdm40-IcguN5R755xmhn3jWV8pKZ0IjPF3Oo3gnCqYF0U87N9gzAhEseeW4nJo1~IqsHE0nM5p63zCqR6dFdVfk7zYDEs27V7e4pjZ~Nln~LlvxnN7C2Qvj66yjKQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"},{"id":108387872,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108387872/thumbnails/1.jpg","file_name":"OS5-1.pdf","download_url":"https://www.academia.edu/attachments/108387872/download_file","bulk_download_file_name":"User_Studies_in_the_Music_Information_Re.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108387872/OS5-1-libre.pdf?1701783003=\u0026response-content-disposition=attachment%3B+filename%3DUser_Studies_in_the_Music_Information_Re.pdf\u0026Expires=1741338292\u0026Signature=AhtyuC7CfVwawNIWMNXZEAMJLDyu95BZc9CZpamv6Y-JBQjLid1TbtJ4Fxr69weMIBTBTIIUtg38QAExBQRlIR2~srZjRhkwndYYrbNx3u6xqKq5sUmMgIxGWJMeHrcIZ2bnQDF-JElE7mx3cnO-Confl676tTGnj19yif-pvIjNoPizNP4QEBI4ozAkSicP4v57TxHQUQLkdeX16AtLft-5XKu31n5P6-EFHNzY8uMsOZAdvJdibENxvh3s1NwmMSgqOb4q9m6sZoHerRsYnl9QAInr37F2uOh2p-iGpl9HD~qhc71SXmx953XWCGs8GbOOiswxRjxMLy45TIxKEw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629279"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629279/Perceptual_validation_of_sound_environment_reproduction_inside_an_aircraft_mock_up"><img alt="Research paper thumbnail of Perceptual validation of sound environment reproduction inside an aircraft mock-up" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629279/Perceptual_validation_of_sound_environment_reproduction_inside_an_aircraft_mock_up">Perceptual validation of sound environment reproduction inside an aircraft mock-up</a></div><div class="wp-workCard_item"><span>Applied Ergonomics</span><span>, 2022</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Auditory comfort evaluations are garnering increased attention in engineering and particularly in...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more 
</span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Auditory comfort evaluations are garnering increased attention in engineering and particularly in the context of air transportation. Being able to produce sound environments corresponding to various flight conditions in aircraft mock-ups would be a valuable tool to investigate acoustic comfort inside aircrafts in controlled environments. Before using such mock-ups, they must be developed and validated in physical and perceptual terms. This paper provides a perceptual validation of sound environment reproduction inside aircraft mock-up. To provide a faithfully reproduced sound environment, time, frequency and spatial characteristics should be preserved. Physical sound field reproduction approaches for spatial sound reproduction are required while properly preserving localization cues at the listener&amp;#39;s ears to recreate a realistic and immersing sound environment. We report a perceptual validation of a sound field reproduction system developed in an aircraft mock-up based on multichannel least-square methods and equalization. Twenty participants evaluated reproduced sound environments relative to a reference sound environment in an aircraft cabin mock-up equipped with a 41-actuator multichannel sound reproduction system. Results indicate that the preferred reproduction corresponds to the best physical reconstruction of the sound environment.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629279"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629279"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629279; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629279]").text(description); $(".js-view-count[data-work-id=110629279]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629279; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629279']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || 
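The abstract names multichannel least-squares reproduction without spelling it out; a common formulation of that idea is frequency-domain pressure matching, in which the actuator weights w at each frequency minimize ||Hw - p||^2 + beta*||w||^2, giving w = (H^H H + beta*I)^-1 H^H p. The NumPy sketch below illustrates that generic solution for a single frequency; the plant matrix, target pressures, and regularization value are placeholders, not the system described in the paper.

```python
# Generic regularized least-squares (pressure-matching) solve at one frequency.
# H: (n_mics x n_actuators) complex transfer matrix measured in the mock-up;
# p: (n_mics,) complex target pressures taken from the reference environment.
# Values below are made-up placeholders for illustration only.
import numpy as np

def ls_driving_weights(H, p, beta=1e-3):
    """Actuator weights w minimizing ||H w - p||^2 + beta * ||w||^2."""
    n_act = H.shape[1]
    A = H.conj().T @ H + beta * np.eye(n_act)
    return np.linalg.solve(A, H.conj().T @ p)

rng = np.random.default_rng(0)
H = rng.standard_normal((8, 41)) + 1j * rng.standard_normal((8, 41))  # 41 actuators
p = rng.standard_normal(8) + 1j * rng.standard_normal(8)
w = ls_driving_weights(H, p)
print("reconstruction error:", np.linalg.norm(H @ w - p))
```
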
Perception of reverberation in coupled volumes: discrimination and suitability
HAL (Le Centre pour la Communication Scientifique Directe), 2013
https://www.academia.edu/110629278/Perception_of_reverberation_in_coupled_volumes_discrimination_and_suitability

Sharing music in public spaces: Social insights from the Musikiosk project
https://www.academia.edu/110629277/Sharing_music_in_public_spaces_Social_insights_from_the_Musikiosk_project
Abstract: We argue for a reconsideration of the role of public sharing of music and technology in urban public settings, based on the results of research involving an interactive sound system. While current legislation in Quebec prevents the playing of amplified music in public, with the support of a Montreal borough we developed and installed an open, free sound system (Musikiosk) allowing users to choose and play their own music in a pocket park off a busy commercial street. The park and system usage were systematically studied in an interdisciplinary research project, framed by current debates on the relationship between music, publicness, and the use of interactive music technologies in public spaces. The study combined observations, questionnaires, and interviews with park users and residents, analyzed through the lens of use patterns and engagement. Results indicate that both users and non-users of the system evaluated Musikiosk as a welcome addition to the park and as a benefit to its conviviality and dynamics, allowing users to share their music in a novel way and thus appropriate their park acoustically. Findings further indicate that the process of shared music consumption is an essential advantage of the system, extending the range of park functions and encouraging interaction and new forms of social dynamics by attracting new users. The positive reactions to Musikiosk point to the need for a re-evaluation of existing norms and regulations on the use of public space, particularly in relation to new forms of publicness created through the sharing of music and technology.

project","internal_url":"https://www.academia.edu/110629277/Sharing_music_in_public_spaces_Social_insights_from_the_Musikiosk_project","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629276"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629276/A_Comparison_of_Recording_Rendering_and_Reproduction_Techniques_for_Multichannel_Spatial_Audio"><img alt="Research paper thumbnail of A Comparison of Recording, Rendering, and Reproduction Techniques for Multichannel Spatial Audio" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629276/A_Comparison_of_Recording_Rendering_and_Reproduction_Techniques_for_Multichannel_Spatial_Audio">A Comparison of Recording, Rendering, and Reproduction Techniques for Multichannel Spatial Audio</a></div><div class="wp-workCard_item"><span>Journal of The Audio Engineering Society</span><span>, Oct 26, 2012</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629276"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629276"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629276; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629276]").text(description); $(".js-view-count[data-work-id=110629276]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629276; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629276']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 
})(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629276]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629276,"title":"A Comparison of Recording, Rendering, and Reproduction Techniques for Multichannel Spatial Audio","internal_url":"https://www.academia.edu/110629276/A_Comparison_of_Recording_Rendering_and_Reproduction_Techniques_for_Multichannel_Spatial_Audio","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629274"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629274/Discrimination_Between_Phonograph_Playback_Systems"><img alt="Research paper thumbnail of Discrimination Between Phonograph Playback Systems" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629274/Discrimination_Between_Phonograph_Playback_Systems">Discrimination Between Phonograph Playback Systems</a></div><div class="wp-workCard_item"><span>Journal of The Audio Engineering Society</span><span>, Oct 19, 2011</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">... expert listeners, 10 males and 4 females, with a mean age of 33.07 (SD = 8.27), took part in ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">... expert listeners, 10 males and 4 females, with a mean age of 33.07 (SD = 8.27), took part in condition 2. We attempted to retain as many participants from the first condition as possible, and were able to retain 8. Participants received $20 CAD for their participation. <a href="http://coltrane" rel="nofollow">http://coltrane</a>. 
...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629274"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629274"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629274; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629274]").text(description); $(".js-view-count[data-work-id=110629274]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629274; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629274']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629274]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629274,"title":"Discrimination Between Phonograph Playback Systems","internal_url":"https://www.academia.edu/110629274/Discrimination_Between_Phonograph_Playback_Systems","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629273"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629273/The_Performing_World_of_Digital_Archives"><img alt="Research paper thumbnail of The Performing World of Digital Archives" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div 
class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629273/The_Performing_World_of_Digital_Archives">The Performing World of Digital Archives</a></div><div class="wp-workCard_item"><span>De Boeck Supérieur eBooks</span><span>, 2013</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629273"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629273"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629273; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629273]").text(description); $(".js-view-count[data-work-id=110629273]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629273; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629273']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629273]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629273,"title":"The Performing World of Digital Archives","internal_url":"https://www.academia.edu/110629273/The_Performing_World_of_Digital_Archives","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629272"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" 
data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/110629272/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_II_Evaluation"><img alt="Research paper thumbnail of Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part II Evaluation" class="work-thumbnail" src="https://attachments.academia-assets.com/108388026/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629272/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_II_Evaluation">Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part II Evaluation</a></div><div class="wp-workCard_item"><span>Journal of The Audio Engineering Society</span><span>, Apr 21, 2016</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Diffuse Field Modeling (DFM) is a systematic means for simulating and reproducing a diffuse field...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Diffuse Field Modeling (DFM) is a systematic means for simulating and reproducing a diffuse field for arbitrary loudspeaker configurations. DFM is presented in two publications: Part I [1] presents the algorithm and this Part II reports the perceptual evaluation. Two experiments were conducted: in Experiment 1 sound recording professionals were to rate different treatments of DFM presented on a 20-channel array. The treatments under evaluation included the geometric modeling of reflections, strategies involving the early portion of the B-Format Room Impulse Response, and a comparison between 0 th and 1 st-order RIR. Results indicate that it is necessary to model the earliest reflections and to use all four channels of the B-Format room impulse response. In Experiment 2 musicians and sound recording professionals were asked to rate DFM and common microphone techniques presented on 3/2 stereophonic setup. 
DFM was found to be perceptually comparable with the Hamasaki Square technique.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="cc8594be08f5000d930d93b745eb4f16" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108388026,&quot;asset_id&quot;:110629272,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108388026/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629272"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629272"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629272; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629272]").text(description); $(".js-view-count[data-work-id=110629272]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629272; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629272']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "cc8594be08f5000d930d93b745eb4f16" } } $('.js-work-strip[data-work-id=110629272]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629272,"title":"Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part II Evaluation","internal_url":"https://www.academia.edu/110629272/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_II_Evaluation","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine 
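The physically-inspired filter design itself belongs to Part I and is not described in this abstract. As a rough illustration of the general idea behind decorrelation-based diffuse-field rendering, the sketch below derives each loudspeaker feed by convolving a common signal (standing in for the B-Format W channel) with its own random-phase, approximately all-pass FIR filter, so the feeds become mutually decorrelated. The filter length, the random-phase construction, and the 20-loudspeaker count are illustrative assumptions, not the paper's design.

```python
# Generic random-phase decorrelation filters (illustration only; the paper's
# physically-inspired filters from Part I are not reproduced here).
import numpy as np

def random_phase_fir(n_taps, rng):
    """Unit-magnitude spectrum with random phase -> approximately all-pass FIR."""
    n_bins = n_taps // 2 + 1
    phase = rng.uniform(-np.pi, np.pi, n_bins)
    phase[0] = 0.0                       # keep DC real
    if n_taps % 2 == 0:
        phase[-1] = 0.0                  # keep Nyquist real
    return np.fft.irfft(np.exp(1j * phase), n=n_taps)

rng = np.random.default_rng(1)
w_signal = rng.standard_normal(48000)          # stand-in for the recorded W channel
filters = [random_phase_fir(2048, rng) for _ in range(20)]   # one filter per speaker
feeds = np.stack([np.convolve(w_signal, h, mode="same") for h in filters])
print(np.round(np.corrcoef(feeds[:3]), 2))     # off-diagonal values near zero
```
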
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":108388026,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108388026/thumbnails/1.jpg","file_name":"jaes.2016.000220231205-1-3elcy3.pdf","download_url":"https://www.academia.edu/attachments/108388026/download_file","bulk_download_file_name":"Diffuse_Field_Modeling_Using_Physically.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108388026/jaes.2016.000220231205-1-3elcy3-libre.pdf?1701787250=\u0026response-content-disposition=attachment%3B+filename%3DDiffuse_Field_Modeling_Using_Physically.pdf\u0026Expires=1741324679\u0026Signature=SzuoOQjUVezeqi-i0d7dlp6C01Sdqh3SDT4ukkiWE5e7gD-bce2vbrgRzN3V~aLManBxgU99GbSqrrqQIes89bbks3JuBAAIzQLJJU9RoOO31MQkZEc6MtO3BLivorg7y0mmmQQ2MFXFh2GkGWX8LD0L~XxDT-07WpHgZJLnc51frGRENtXQgPxBYIbDLr8Xzu6TXmIlT9YtfDkXbNNEBC2kvDSj-URX6plv9z~YXOj8IwArVHFaRM9v-PFBOc075P1OeaZi-YRCwilkiILihlsRjSYnuef4~-lxgVGjcrgitrtX90inpBbOqIRBtMjUkvF1~xltP9YkOn8JRxzzxw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629271"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629271/Exploiting_major_trends_in_subject_hierarchies_for_large_scale_collection_visualization"><img alt="Research paper thumbnail of Exploiting major trends in subject hierarchies for large-scale collection visualization" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629271/Exploiting_major_trends_in_subject_hierarchies_for_large_scale_collection_visualization">Exploiting major trends in subject hierarchies for large-scale collection visualization</a></div><div class="wp-workCard_item"><span>Proceedings of SPIE</span><span>, Jan 22, 2012</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">ABSTRACT Many large digital collections are currently organized by subject; however, these useful...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">ABSTRACT Many large digital collections are currently organized by subject; however, these useful information organization structures are large and complex, making them difficult to browse. Current online tools and visualization prototypes show small localized subsets and do not provide the ability to explore the predominant patterns of the overall subject structure. This research addresses this issue by simplifying the subject structure using two techniques based on the highly uneven distribution of real-world collections: level compression and child pruning. The approach is demonstrated using a sample of 130K records organized by the Library of Congress Subject Headings (LCSH). 
Promising results show that the subject hierarchy can be reduced down to 42% of its initial size, while maintaining access to 81% of the collection. The visual impact is demonstrated using a traditional outline view allowing searchers to dynamically change the amount of complexity that they feel necessary for the tasks at hand.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629271"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629271"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629271; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629271]").text(description); $(".js-view-count[data-work-id=110629271]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629271; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629271']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629271]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629271,"title":"Exploiting major trends in subject hierarchies for large-scale collection visualization","internal_url":"https://www.academia.edu/110629271/Exploiting_major_trends_in_subject_hierarchies_for_large_scale_collection_visualization","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> </div><div class="profile--tab_content_container js-tab-pane tab-pane" data-section-id="4884023" id="papers"><div class="js-work-strip 
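
The abstract names the two simplification operations but does not spell them out. The toy sketch below is one plausible reading of child pruning and level compression on a record-count-weighted hierarchy; the data and representation are invented, and it is not the authors' implementation.

    # Hypothetical sketch of "child pruning" and "level compression" on a toy
    # subject hierarchy.  A node is (own_record_count, {child_heading: node}).
    # Illustration only, not the authors' algorithm.

    def subtree_records(node):
        count, children = node
        return count + sum(subtree_records(c) for c in children.values())

    def prune_children(node, min_records):
        """Child pruning: drop subtrees holding fewer than min_records records."""
        count, children = node
        kept = {name: prune_children(child, min_records)
                for name, child in children.items()
                if subtree_records(child) >= min_records}
        return (count, kept)

    def compress_levels(node):
        """Level compression: a heading with no records of its own and a single
        child is replaced by that child (a real system would merge the labels)."""
        count, children = node
        children = {name: compress_levels(child) for name, child in children.items()}
        if count == 0 and len(children) == 1:
            return next(iter(children.values()))
        return (count, children)

    hierarchy = (0, {"Science": (5, {"Physics": (0, {"Acoustics": (40, {})}),
                                     "Obscure heading": (1, {})})})
    print(compress_levels(prune_children(hierarchy, min_records=10)))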
profile--work_container" data-work-id="110629292"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/110629292/Montreal_soundscapes_during_the_COVID_19_pandemic_A_spatial_analysis_of_noise_complaints_and_residents_surveys"><img alt="Research paper thumbnail of Montreal soundscapes during the COVID-19 pandemic: A spatial analysis of noise complaints and residents’ surveys" class="work-thumbnail" src="https://attachments.academia-assets.com/108387873/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629292/Montreal_soundscapes_during_the_COVID_19_pandemic_A_spatial_analysis_of_noise_complaints_and_residents_surveys">Montreal soundscapes during the COVID-19 pandemic: A spatial analysis of noise complaints and residents’ surveys</a></div><div class="wp-workCard_item"><span>Noise Mapping</span><span>, 2023</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Public health measures during the COVID-19 pandemic provided researchers with a quasi-experimenta...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Public health measures during the COVID-19 pandemic provided researchers with a quasi-experimental situation to examine what happens when anthropogenic noise sources (e.g., traffic) are greatly reduced. This article combines noise-related calls to Montreal&#39;s 311 service (29,891 calls from 2014 to 2022) with original survey data from 240 residents collected in 2020 after the lockdown and the summer reopening. The spatial analysis of the calls revealed that, across all pandemic phases, noise complaints increased with population density, the proportion of low-income residents, and the proportion of greenspace. However, the change in the spatial distribution of noise-related calls due to the pandemic measures is positively associated with the proportions of residential and greenspace land use. That is, areas with higher proportions of residential land use and greenspace experienced the greatest increase in noiserelated calls. The analysis of the survey revealed that the sounds of traffic and construction decreased during both the lockdown and the subsequent reopening, while the sounds of the neighborhood and nature increased. However, the decreased traffic noise in the downtown core also allowed for the emergence of noise from the heating, ventilation and air conditioning systems in the area. 
We discuss these results considering the interest in reducing noise levels in cities.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="38da13f0488c48a057fa8ddd7a854849" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108387873,&quot;asset_id&quot;:110629292,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108387873/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629292"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629292"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629292; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629292]").text(description); $(".js-view-count[data-work-id=110629292]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629292; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629292']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "38da13f0488c48a057fa8ddd7a854849" } } $('.js-work-strip[data-work-id=110629292]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629292,"title":"Montreal soundscapes during the COVID-19 pandemic: A spatial analysis of noise complaints and residents’ surveys","internal_url":"https://www.academia.edu/110629292/Montreal_soundscapes_during_the_COVID_19_pandemic_A_spatial_analysis_of_noise_complaints_and_residents_surveys","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine 
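
For readers unfamiliar with this kind of area-level comparison, the sketch below is a deliberately simplified, hypothetical illustration of relating the per-area change in complaint counts to land-use proportions. The column names (area, period, calls, pct_residential, pct_greenspace) and all figures are invented; it does not reproduce the paper's analysis.

    # Hypothetical area-level comparison of noise-related calls, with invented data.
    import pandas as pd

    calls = pd.DataFrame({
        "area":   ["A", "A", "B", "B", "C", "C"],
        "period": ["pre", "pandemic"] * 3,
        "calls":  [120, 150, 80, 140, 60, 65],
    })
    land_use = pd.DataFrame({
        "area": ["A", "B", "C"],
        "pct_residential": [0.55, 0.80, 0.30],
        "pct_greenspace":  [0.10, 0.25, 0.05],
    })

    # Relative change in calls per area between the two periods.
    wide = calls.pivot(index="area", columns="period", values="calls")
    wide["pct_change"] = (wide["pandemic"] - wide["pre"]) / wide["pre"]

    # How does that change co-vary with the land-use mix of each area?
    merged = land_use.merge(wide["pct_change"].reset_index(), on="area")
    print(merged[["pct_change", "pct_residential", "pct_greenspace"]].corr(method="spearman"))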
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":108387873,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108387873/thumbnails/1.jpg","file_name":"pdf.pdf","download_url":"https://www.academia.edu/attachments/108387873/download_file","bulk_download_file_name":"Montreal_soundscapes_during_the_COVID_19.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108387873/pdf-libre.pdf?1701783003=\u0026response-content-disposition=attachment%3B+filename%3DMontreal_soundscapes_during_the_COVID_19.pdf\u0026Expires=1741192868\u0026Signature=ToM9-xO~gxAkPJ86AtxVnctBhzJtTFWYRpXcr0hajP93LFd72mexsaUqCLce02tpqThy8Axw0Jh67WrT5rG23cd59vLT2PetRnz1BQPLmV2-aN5NZ~jDwyPbwaHRbZNJYNK~E46Gchj1ohRWsauDL4ET6KOcQYWPVfYQtdrbwidypWma6Q6fksIvhXW0mawz5~9YT2iCXE3-69l97KdzPb41JleBq4SqIy0e~0Xx8q4d0Jf1M6Deyb3s9TYteZRiRoJ5tM3SKMyuSXPa3OtQF33~p2sVwINxwHyy6vJcuaYA29W8G4ksNiwZsyxiuEC5H3SFDPCbyliyYEbGNZRgWA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"},{"id":108387874,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108387874/thumbnails/1.jpg","file_name":"pdf.pdf","download_url":"https://www.academia.edu/attachments/108387874/download_file","bulk_download_file_name":"Montreal_soundscapes_during_the_COVID_19.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108387874/pdf-libre.pdf?1701783005=\u0026response-content-disposition=attachment%3B+filename%3DMontreal_soundscapes_during_the_COVID_19.pdf\u0026Expires=1741192868\u0026Signature=fwrzumTwaryjtKfoptOvF5a3NpT3VHXOopGViq~jA2uR5XBlWlCYuybgNP1jMjuEuM3TpRgrq3qoMIYejZUBXfj2AhDOt5RlGxy4T7I9IXDTs6joIROe0jZciS-OyMOS4X9yOILNj4bebUzLoW3rFh1EZvarOnGHoBkh9Dl4SZCL4Voj11YDlYGSvNUwTUbbI-yc~oSALU-pGKVXqpfgRVpa0i7MZuQvOWhRFClbtVoiubhK~Z5Q3pxRps0ctZMWVpeo7RhYX~Agb7msHd89nXCI-jpvOhqQC~kuWBCWGh6Do5VGKnZvSl5fgxibmncU6gTSlPXRvIMyU6J5xgAksQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629291"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/110629291/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_I_Algorithm"><img alt="Research paper thumbnail of Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part I Algorithm" class="work-thumbnail" src="https://attachments.academia-assets.com/108388013/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629291/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_I_Algorithm">Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part I Algorithm</a></div><div class="wp-workCard_item"><span>Journal of The Audio Engineering Society</span><span>, Apr 21, 2016</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Diffuse Field Modeling (DFM) is a systematic means for simulating and reproducing a diffuse field...</span><a class="js-work-more-abstract" 
data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Diffuse Field Modeling (DFM) is a systematic means for simulating and reproducing a diffuse field for arbitrary loudspeaker configurations. DFM is presented in two publications: Part I [1] presents the algorithm and this Part II reports the perceptual evaluation. Two experiments were conducted: in Experiment 1 sound recording professionals were to rate different treatments of DFM presented on a 20-channel array. The treatments under evaluation included the geometric modeling of reflections, strategies involving the early portion of the B-Format Room Impulse Response, and a comparison between 0 th and 1 st-order RIR. Results indicate that it is necessary to model the earliest reflections and to use all four channels of the B-Format room impulse response. In Experiment 2 musicians and sound recording professionals were asked to rate DFM and common microphone techniques presented on 3/2 stereophonic setup. DFM was found to be perceptually comparable with the Hamasaki Square technique.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="f62f1ee78d9f4851402884762ff61124" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108388013,&quot;asset_id&quot;:110629291,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108388013/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629291"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629291"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629291; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629291]").text(description); $(".js-view-count[data-work-id=110629291]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629291; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629291']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 
})(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "f62f1ee78d9f4851402884762ff61124" } } $('.js-work-strip[data-work-id=110629291]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629291,"title":"Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part I Algorithm","internal_url":"https://www.academia.edu/110629291/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_I_Algorithm","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":108388013,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108388013/thumbnails/1.jpg","file_name":"jaes.2016.000220231205-1-pi4deb.pdf","download_url":"https://www.academia.edu/attachments/108388013/download_file","bulk_download_file_name":"Diffuse_Field_Modeling_Using_Physically.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108388013/jaes.2016.000220231205-1-pi4deb-libre.pdf?1701787253=\u0026response-content-disposition=attachment%3B+filename%3DDiffuse_Field_Modeling_Using_Physically.pdf\u0026Expires=1741324679\u0026Signature=IltVXbeGfcScTvAR1FUYdHJnMevVpfYuv7K2raaa~~dZjQwblpLosw7aMvCKjgYaw1pqrTKAufe2xC-kygxk8Q3~k86oz0Xjbe4tddXPPSey1VqHzqtPRW-O0glpUBMJfwaI0UNl-d1tnIvs7k27dbEdYDIixwsQqpUUEsNUyr4tgLhUN8bLrAfkWzgmBiAX79BtBN3UBOkwrW1bM8BMKVJRPUFQ25efMhN8wPij-HS9AomQs4fE1M4pWzsfNb35yP3xE~B1UkH--95K0nLa1nXYJM3QJ2~pIHA25z5ZZfguyZNU39oIkyBtyj3hiEZPKDu6mhH3Z--kJGUkMUn~~w__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629290"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629290/Chapter_10_Questioning_sensory_experience"><img alt="Research paper thumbnail of Chapter 10. Questioning sensory experience" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629290/Chapter_10_Questioning_sensory_experience">Chapter 10. 
Chapter 10. Questioning sensory experience
John Benjamins Publishing Company eBooks, Dec 15, 2021
https://www.academia.edu/110629290/Chapter_10_Questioning_sensory_experience

Perceived suitability of reverberation in large coupled volume concert halls
Psychomusicology: Music, Mind and Brain, Sep 1, 2015
https://www.academia.edu/110629289/Perceived_suitability_of_reverberation_in_large_coupled_volume_concert_halls
})(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629289]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629289,"title":"Perceived suitability of reverberation in large coupled volume concert halls","internal_url":"https://www.academia.edu/110629289/Perceived_suitability_of_reverberation_in_large_coupled_volume_concert_halls","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629288"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629288/Informing_sound_art_design_in_public_space_through_soundscape_simulation"><img alt="Research paper thumbnail of Informing sound art design in public space through soundscape simulation" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629288/Informing_sound_art_design_in_public_space_through_soundscape_simulation">Informing sound art design in public space through soundscape simulation</a></div><div class="wp-workCard_item"><span>NOISE-CON ... proceedings</span><span>, Feb 1, 2023</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Urban sound management often amounts to reducing sound levels with the underlying assumption of s...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Urban sound management often amounts to reducing sound levels with the underlying assumption of sound/noise as a nuisance. However, a reduction in sound level does not necessarily lead to a more pleasant auditory experience, especially in urban public spaces where vibrancy can be sought after. A proactive design approach that accounts for the human experience of sound environment is needed to improve the quality of urban spaces. Recent studies in soundscape research suggest that added sound and particularly sound art installations can have a positive influence on public space evaluations. 
Yet, the role of added sounds in urban context remains understudied and there is no existing method to date to inform sound art composition in public space through soundscape simulation. We present here a research-creation collaboration around the design of a permanent sound installation in an urban public space in Paris: Nadine Schütz&amp;#39;s Niches Acoustiques. We report on a series of listening tests involving High-Order Ambisonic soundscape simulations of different prototypes to inform the sound artist&amp;#39;s composition in order to optimize the quality of public space experience in the presence of the sound installation.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629288"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629288"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629288; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629288]").text(description); $(".js-view-count[data-work-id=110629288]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629288; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629288']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629288]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629288,"title":"Informing sound art design in public space through soundscape simulation","internal_url":"https://www.academia.edu/110629288/Informing_sound_art_design_in_public_space_through_soundscape_simulation","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: 
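
The study used Higher-Order Ambisonic simulations; as a much-reduced illustration of how a candidate installation sound can be placed into an ambisonic scene, the sketch below encodes an invented mono signal into traditional first-order B-format (FuMa convention). It is illustrative only and not the simulation pipeline used in the work.

    # Illustrative first-order ambisonic (traditional B-format, FuMa) encoding of
    # a mono source at a given azimuth/elevation; signal and positions invented.
    import numpy as np

    def encode_bformat(mono, azimuth_deg, elevation_deg):
        az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
        w = mono / np.sqrt(2.0)
        x = mono * np.cos(az) * np.cos(el)
        y = mono * np.sin(az) * np.cos(el)
        z = mono * np.sin(el)
        return np.stack([w, x, y, z])         # shape: (4, n_samples)

    fs = 48_000
    t = np.arange(fs) / fs
    chime = np.sin(2 * np.pi * 880 * t) * np.exp(-3 * t)   # stand-in installation sound
    scene = encode_bformat(chime, azimuth_deg=45, elevation_deg=10)
    print(scene.shape)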
Applying the stratified model of relevance interactions to music information retrieval
Proceedings of the Association for Information Science and Technology, 2013
https://www.academia.edu/110629287/Applying_the_stratified_model_of_relevance_interactions_to_music_information_retrieval
Abstract: While research on the notion of relevance has a long and rich history in information retrieval for textual documents, formal considerations of relevance concepts in Music Information Retrieval (MIR) remain scarce. We discuss the application of Saracevic's stratified model of relevance interactions to the music information domain. This model offers a tool for deliberation on the development of user-oriented MIR systems, and a framework for the aggregation of findings on the music information needs and behaviours of potential users.
href="https://www.academia.edu/110629286/A_Perceptual_Evaluation_of_Room_Effect_Methods_for_Multichannel_Spatial_Audio"><img alt="Research paper thumbnail of A Perceptual Evaluation of Room Effect Methods for Multichannel Spatial Audio" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629286/A_Perceptual_Evaluation_of_Room_Effect_Methods_for_Multichannel_Spatial_Audio">A Perceptual Evaluation of Room Effect Methods for Multichannel Spatial Audio</a></div><div class="wp-workCard_item"><span>Journal of The Audio Engineering Society</span><span>, Oct 16, 2013</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629286"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629286"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629286; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629286]").text(description); $(".js-view-count[data-work-id=110629286]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629286; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629286']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629286]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629286,"title":"A Perceptual Evaluation of Room Effect Methods for Multichannel Spatial 
Audio","internal_url":"https://www.academia.edu/110629286/A_Perceptual_Evaluation_of_Room_Effect_Methods_for_Multichannel_Spatial_Audio","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629284"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629284/Supporting_Sounds_Design_and_Evaluation_of_an_Audio_Haptic_Interface"><img alt="Research paper thumbnail of Supporting Sounds: Design and Evaluation of an Audio-Haptic Interface" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629284/Supporting_Sounds_Design_and_Evaluation_of_an_Audio_Haptic_Interface">Supporting Sounds: Design and Evaluation of an Audio-Haptic Interface</a></div><div class="wp-workCard_item"><span>Springer eBooks</span><span>, 2012</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">ABSTRACT The design and evaluation of a multimodal interface is presented in order to investigate...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">ABSTRACT The design and evaluation of a multimodal interface is presented in order to investigate how spatial audio and haptic feedback can be used to convey the navigational structure of a virtual environment. The non-visual 3D virtual environment is composed of a number of parallel planes with either horizontal or vertical orientations. The interface was evaluated using a target-finding task to explore how auditory feedback can be used in isolation or combined with haptic feedback for navigation. Twenty-three users were asked to locate targets using auditory feedback in the virtual structure across both horizontal and vertical orientations of the planes, with and without haptic feedback. 
Findings from the evaluation experiment reveal that users performed the task faster in the bi-modal conditions (with combined auditory and haptic feedback) with a horizontal orientation of the virtual planes.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629284"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629284"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629284; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629284]").text(description); $(".js-view-count[data-work-id=110629284]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629284; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629284']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629284]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629284,"title":"Supporting Sounds: Design and Evaluation of an Audio-Haptic Interface","internal_url":"https://www.academia.edu/110629284/Supporting_Sounds_Design_and_Evaluation_of_an_Audio_Haptic_Interface","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629283"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" 
href="https://www.academia.edu/110629283/Perceptual_Evaluation_of_a_Real_time_Synthesis_Technique_for_Rolling_Sounds"><img alt="Research paper thumbnail of Perceptual Evaluation of a Real-time Synthesis Technique for Rolling Sounds" class="work-thumbnail" src="https://attachments.academia-assets.com/108791784/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629283/Perceptual_Evaluation_of_a_Real_time_Synthesis_Technique_for_Rolling_Sounds">Perceptual Evaluation of a Real-time Synthesis Technique for Rolling Sounds</a></div><div class="wp-workCard_item"><span>HAL (Le Centre pour la Communication Scientifique Directe)</span><span>, 2007</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">In this study 6 different versions of a new real-time synthesizer for contact sounds have been ev...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">In this study 6 different versions of a new real-time synthesizer for contact sounds have been evaluated in order to identify the most effective algorithm to create a realistic sound for rolling objects. 18 participants took part in a perceptual evaluation experiment. Results are presented in terms of both statistical analysis of the most effective synthesis algorithm and qualitative user comments. Finally recommendations for future implementations of synthesis techniques and subsequent perceptual evaluations are presented and discussed.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="e4e09133550f510a09e910f50ddfaf4e" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108791784,&quot;asset_id&quot;:110629283,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108791784/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629283"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629283"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629283; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629283]").text(description); $(".js-view-count[data-work-id=110629283]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629283; 
window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629283']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "e4e09133550f510a09e910f50ddfaf4e" } } $('.js-work-strip[data-work-id=110629283]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629283,"title":"Perceptual Evaluation of a Real-time Synthesis Technique for Rolling Sounds","internal_url":"https://www.academia.edu/110629283/Perceptual_Evaluation_of_a_Real_time_Synthesis_Technique_for_Rolling_Sounds","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":108791784,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108791784/thumbnails/1.jpg","file_name":"murphyEnactive08.pdf","download_url":"https://www.academia.edu/attachments/108791784/download_file","bulk_download_file_name":"Perceptual_Evaluation_of_a_Real_time_Syn.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108791784/murphyEnactive08-libre.pdf?1702347208=\u0026response-content-disposition=attachment%3B+filename%3DPerceptual_Evaluation_of_a_Real_time_Syn.pdf\u0026Expires=1741324679\u0026Signature=WNIx6-70Ih-O6RVmOSj1-TDJw1PDRCM200afl1nWXr0Aa4SZnlXGW-~gZmsrFhbDUi4hFEYsE-zi5vAMPuCZXA0ilPcD4faFeqP1W7jzBQahIyQMrqTLMPkmReDpFZ5N0anoV5YaNNqyi5UBE5Zw5Yc1yo4flAWU7VXr8tGNfX1Vu8WFq8uPa80ay8csVOGvSR9VPmy719qLv7zYnd-P0DDBMFJYotcG6JxInaoqDF24JV6WpPmKOjshFdMnPiHYoLfxZJRfV-d6w28FZ5MrCXxrunpigxA8e0vy2~39-QibfolzN73eLpdpzTZ5m1pupJtDw-nk3evLeMidkaIldA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629282"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629282/Etude_s%C3%A9mantique_et_acoustique_de_la_perception_des_basses_fr%C3%A9quences_dans_lenvironnement_sonore_urbain"><img alt="Research paper thumbnail of Etude sémantique et acoustique de la perception des basses fréquences dans l&#39;environnement sonore urbain" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div 
class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629282/Etude_s%C3%A9mantique_et_acoustique_de_la_perception_des_basses_fr%C3%A9quences_dans_lenvironnement_sonore_urbain">Etude sémantique et acoustique de la perception des basses fréquences dans l&#39;environnement sonore urbain</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629282"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629282"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629282; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629282]").text(description); $(".js-view-count[data-work-id=110629282]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629282; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629282']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629282]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629282,"title":"Etude sémantique et acoustique de la perception des basses fréquences dans l'environnement sonore urbain","internal_url":"https://www.academia.edu/110629282/Etude_s%C3%A9mantique_et_acoustique_de_la_perception_des_basses_fr%C3%A9quences_dans_lenvironnement_sonore_urbain","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div 
class="js-work-strip profile--work_container" data-work-id="110629281"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/110629281/Constructing_a_true_LCSH_tree_of_a_science_and_engineering_collection"><img alt="Research paper thumbnail of Constructing a true LCSH tree of a science and engineering collection" class="work-thumbnail" src="https://attachments.academia-assets.com/108388096/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629281/Constructing_a_true_LCSH_tree_of_a_science_and_engineering_collection">Constructing a true LCSH tree of a science and engineering collection</a></div><div class="wp-workCard_item"><span>Journal of the Association for Information Science and Technology</span><span>, Nov 15, 2012</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">The Library of Congress Subject Headings (LCSH) is a subject structure used to index large librar...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">The Library of Congress Subject Headings (LCSH) is a subject structure used to index large library collections throughout the world. Browsing a collection through LCSH is difficult using current online tools in part because users cannot explore the structure using their existing experience navigating file hierarchies on their hard drives. This is due to inconsistencies in the LCSH structure, which does not adhere to the specific rules defining tree structures. This article proposes a method to adapt the LCSH structure to reflect a real-world collection from the domain of science and engineering. This structure is transformed into a valid tree structure using an automatic process. The analysis of the resulting LCSH tree shows a large and complex structure. 
The analysis of the distribution of information within the LCSH tree reveals a power law distribution where the vast majority of subjects contain few information items and a few subjects contain the vast majority of the collection.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="7b1c5b90c599f0ad2b84e86fe0cdba5d" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108388096,&quot;asset_id&quot;:110629281,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108388096/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629281"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629281"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629281; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629281]").text(description); $(".js-view-count[data-work-id=110629281]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629281; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629281']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "7b1c5b90c599f0ad2b84e86fe0cdba5d" } } $('.js-work-strip[data-work-id=110629281]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629281,"title":"Constructing a true LCSH tree of a science and engineering collection","internal_url":"https://www.academia.edu/110629281/Constructing_a_true_LCSH_tree_of_a_science_and_engineering_collection","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine 
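The abstract mentions transforming the LCSH subject graph, which is not a valid tree, into one. The paper's actual procedure is not given here; as a minimal sketch of the general idea, the snippet below derives a tree from a toy broader-term graph (with a cycle and a multi-parent heading) by a breadth-first walk that keeps a single parent per heading. All heading names, the `extract_tree` function, and the `broader` dictionary are made up for illustration.

```python
# Hedged sketch: deriving a valid tree from a subject graph with multiple
# broader terms and a cycle, by keeping the first parent reached in a BFS.
# Generic illustration only, NOT the algorithm used in the paper.
from collections import deque

def extract_tree(broader, root):
    """broader: dict heading -> set of broader (parent) headings.
    Returns a child -> parent dict forming a tree rooted at `root`."""
    # Invert to narrower-term edges so we can walk the hierarchy top-down.
    narrower = {}
    for child, parents in broader.items():
        for p in parents:
            narrower.setdefault(p, set()).add(child)

    parent_of, seen, queue = {}, {root}, deque([root])
    while queue:
        node = queue.popleft()
        for child in sorted(narrower.get(node, ())):
            if child in seen:        # extra parent or cycle: skip the edge
                continue
            parent_of[child] = node  # keep exactly one parent per heading
            seen.add(child)
            queue.append(child)
    return parent_of

# Toy example: Acoustics <-> Sound form a cycle; Psychoacoustics has two parents.
broader = {
    "Acoustics": {"Science", "Sound"},
    "Sound": {"Acoustics", "Science"},
    "Psychoacoustics": {"Acoustics", "Psychology"},
    "Psychology": {"Science"},
}
print(extract_tree(broader, "Science"))
```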
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":108388096,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108388096/thumbnails/1.jpg","file_name":"asi.2274920231205-1-iyva8q.pdf","download_url":"https://www.academia.edu/attachments/108388096/download_file","bulk_download_file_name":"Constructing_a_true_LCSH_tree_of_a_scien.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108388096/asi.2274920231205-1-iyva8q-libre.pdf?1701787244=\u0026response-content-disposition=attachment%3B+filename%3DConstructing_a_true_LCSH_tree_of_a_scien.pdf\u0026Expires=1741324679\u0026Signature=Wzzy-vRtNbOOHjQo~RIaeicEx0T-9a9f4LBfuhSW8JfNl-txj1UTZxGY~JKKLlbmObRQKM3~VQSDnN7YOhNg3kGdragasHKOSdUQYcPmrkRLvIq2nM104QVRz01LD2NXkRUFwHQ~U~edtTFd4whbuiZucUATl2uJhuHNuWhaWx6Lq8MS~ahY4E6rI6zC4OaadKV63~ShARYc-gnfc0o7Z39FrEku-6mwsdWBxxwbHkQmErJtn7NhoPqrW~tp9YUiw5ur4aGKtOTd9vnRclBavE5IVceSmQ0L4wnhSHicJS05vY-q7NmYbrQCu5b9LgoBe-n2R8RYazLlZQp77plKgw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629280"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/110629280/User_Studies_in_the_Music_Information_Retrieval_Literature"><img alt="Research paper thumbnail of User Studies in the Music Information Retrieval Literature" class="work-thumbnail" src="https://attachments.academia-assets.com/108387871/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629280/User_Studies_in_the_Music_Information_Retrieval_Literature">User Studies in the Music Information Retrieval Literature</a></div><div class="wp-workCard_item"><span>International Symposium/Conference on Music Information Retrieval</span><span>, 2011</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">This paper presents an overview of user studies in the Music Information Retrieval (MIR) literatu...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">This paper presents an overview of user studies in the Music Information Retrieval (MIR) literature. A focus on the user has repeatedly been identified as a key requirement for future MIR research; yet empirical user studies have been relatively sparse in the literature, the overwhelming research attention in MIR remaining systems-focused. 
We present research topics, methodologies, and design implications covered in the user studies conducted thus far.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="62dd3b2e1498163e7d78712aef33e27b" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108387871,&quot;asset_id&quot;:110629280,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108387871/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629280"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629280"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629280; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629280]").text(description); $(".js-view-count[data-work-id=110629280]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629280; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629280']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "62dd3b2e1498163e7d78712aef33e27b" } } $('.js-work-strip[data-work-id=110629280]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629280,"title":"User Studies in the Music Information Retrieval Literature","internal_url":"https://www.academia.edu/110629280/User_Studies_in_the_Music_Information_Retrieval_Literature","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine 
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":108387871,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108387871/thumbnails/1.jpg","file_name":"OS5-1.pdf","download_url":"https://www.academia.edu/attachments/108387871/download_file","bulk_download_file_name":"User_Studies_in_the_Music_Information_Re.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108387871/OS5-1-libre.pdf?1701783003=\u0026response-content-disposition=attachment%3B+filename%3DUser_Studies_in_the_Music_Information_Re.pdf\u0026Expires=1741338292\u0026Signature=OKZKJpaG4sxT1NjqL3dKon3PQ2DmfFBbz~lL8n40PxQNZS~P65hxYDGtYQhpIv0J9~JCIvjADsVwO9vQO36uLz8GSmQvIij9Ohnu0VJxnXqDI0AdwDll9ZbYC5g7cqlwreJI853XUuFcAfxcUvzDNf-ksCQcQ~Pg-jD-~lIPR47JfxH0B7S2yPZ6pAwavw1LU14XDakrANjnF1x4LzIysphShUdm40-IcguN5R755xmhn3jWV8pKZ0IjPF3Oo3gnCqYF0U87N9gzAhEseeW4nJo1~IqsHE0nM5p63zCqR6dFdVfk7zYDEs27V7e4pjZ~Nln~LlvxnN7C2Qvj66yjKQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"},{"id":108387872,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108387872/thumbnails/1.jpg","file_name":"OS5-1.pdf","download_url":"https://www.academia.edu/attachments/108387872/download_file","bulk_download_file_name":"User_Studies_in_the_Music_Information_Re.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108387872/OS5-1-libre.pdf?1701783003=\u0026response-content-disposition=attachment%3B+filename%3DUser_Studies_in_the_Music_Information_Re.pdf\u0026Expires=1741338292\u0026Signature=AhtyuC7CfVwawNIWMNXZEAMJLDyu95BZc9CZpamv6Y-JBQjLid1TbtJ4Fxr69weMIBTBTIIUtg38QAExBQRlIR2~srZjRhkwndYYrbNx3u6xqKq5sUmMgIxGWJMeHrcIZ2bnQDF-JElE7mx3cnO-Confl676tTGnj19yif-pvIjNoPizNP4QEBI4ozAkSicP4v57TxHQUQLkdeX16AtLft-5XKu31n5P6-EFHNzY8uMsOZAdvJdibENxvh3s1NwmMSgqOb4q9m6sZoHerRsYnl9QAInr37F2uOh2p-iGpl9HD~qhc71SXmx953XWCGs8GbOOiswxRjxMLy45TIxKEw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629279"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629279/Perceptual_validation_of_sound_environment_reproduction_inside_an_aircraft_mock_up"><img alt="Research paper thumbnail of Perceptual validation of sound environment reproduction inside an aircraft mock-up" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629279/Perceptual_validation_of_sound_environment_reproduction_inside_an_aircraft_mock_up">Perceptual validation of sound environment reproduction inside an aircraft mock-up</a></div><div class="wp-workCard_item"><span>Applied Ergonomics</span><span>, 2022</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Auditory comfort evaluations are garnering increased attention in engineering and particularly in...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more 
</span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Auditory comfort evaluations are garnering increased attention in engineering and particularly in the context of air transportation. Being able to produce sound environments corresponding to various flight conditions in aircraft mock-ups would be a valuable tool to investigate acoustic comfort inside aircrafts in controlled environments. Before using such mock-ups, they must be developed and validated in physical and perceptual terms. This paper provides a perceptual validation of sound environment reproduction inside aircraft mock-up. To provide a faithfully reproduced sound environment, time, frequency and spatial characteristics should be preserved. Physical sound field reproduction approaches for spatial sound reproduction are required while properly preserving localization cues at the listener&amp;#39;s ears to recreate a realistic and immersing sound environment. We report a perceptual validation of a sound field reproduction system developed in an aircraft mock-up based on multichannel least-square methods and equalization. Twenty participants evaluated reproduced sound environments relative to a reference sound environment in an aircraft cabin mock-up equipped with a 41-actuator multichannel sound reproduction system. Results indicate that the preferred reproduction corresponds to the best physical reconstruction of the sound environment.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629279"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629279"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629279; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629279]").text(description); $(".js-view-count[data-work-id=110629279]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629279; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629279']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || 
Perception of reverberation in coupled volumes: discrimination and suitability
HAL (Le Centre pour la Communication Scientifique Directe), 2013
https://www.academia.edu/110629278/Perception_of_reverberation_in_coupled_volumes_discrimination_and_suitability

Sharing music in public spaces: Social insights from the Musikiosk project
https://www.academia.edu/110629277/Sharing_music_in_public_spaces_Social_insights_from_the_Musikiosk_project
Abstract: We argue for a reconsideration of the role of public sharing of music and technology in urban, public settings, based on the results of research involving an interactive sound system. While current legislation in Quebec prevents the playing of amplified music in public, with the support of a Montreal borough we developed and installed an open, free sound system (Musikiosk) allowing users to choose and play their own music in a pocket park off a busy commercial street. The park and system usage were systematically studied in an interdisciplinary research project, framed by current debates on the relationship between music, publicness and the use of interactive music technologies in public spaces. It combined observations, questionnaires and interviews with park users and residents, analyzed through the lens of use patterns and engagement. Results indicate that both users and non-users of the system evaluate Musikiosk as a welcome addition to the park and as a benefit to its conviviality and dynamics, by allowing users to share their music in a novel way and thus to appropriate their park acoustically. Findings further indicate that the process of shared music consumption is an essential advantage of the system, extending the range of park functions and encouraging interaction and different forms of social dynamics by also attracting new users. The positive reactions to Musikiosk show the need for a reevaluation of existing norms and regulations on public space use, particularly in relation to new forms of publicness through the sharing of music and technology.
project","internal_url":"https://www.academia.edu/110629277/Sharing_music_in_public_spaces_Social_insights_from_the_Musikiosk_project","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629276"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629276/A_Comparison_of_Recording_Rendering_and_Reproduction_Techniques_for_Multichannel_Spatial_Audio"><img alt="Research paper thumbnail of A Comparison of Recording, Rendering, and Reproduction Techniques for Multichannel Spatial Audio" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629276/A_Comparison_of_Recording_Rendering_and_Reproduction_Techniques_for_Multichannel_Spatial_Audio">A Comparison of Recording, Rendering, and Reproduction Techniques for Multichannel Spatial Audio</a></div><div class="wp-workCard_item"><span>Journal of The Audio Engineering Society</span><span>, Oct 26, 2012</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629276"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629276"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629276; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629276]").text(description); $(".js-view-count[data-work-id=110629276]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629276; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629276']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 
})(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629276]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629276,"title":"A Comparison of Recording, Rendering, and Reproduction Techniques for Multichannel Spatial Audio","internal_url":"https://www.academia.edu/110629276/A_Comparison_of_Recording_Rendering_and_Reproduction_Techniques_for_Multichannel_Spatial_Audio","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629274"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629274/Discrimination_Between_Phonograph_Playback_Systems"><img alt="Research paper thumbnail of Discrimination Between Phonograph Playback Systems" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629274/Discrimination_Between_Phonograph_Playback_Systems">Discrimination Between Phonograph Playback Systems</a></div><div class="wp-workCard_item"><span>Journal of The Audio Engineering Society</span><span>, Oct 19, 2011</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">... expert listeners, 10 males and 4 females, with a mean age of 33.07 (SD = 8.27), took part in ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">... expert listeners, 10 males and 4 females, with a mean age of 33.07 (SD = 8.27), took part in condition 2. We attempted to retain as many participants from the first condition as possible, and were able to retain 8. Participants received $20 CAD for their participation. <a href="http://coltrane" rel="nofollow">http://coltrane</a>. 
...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629274"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629274"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629274; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629274]").text(description); $(".js-view-count[data-work-id=110629274]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629274; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629274']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629274]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629274,"title":"Discrimination Between Phonograph Playback Systems","internal_url":"https://www.academia.edu/110629274/Discrimination_Between_Phonograph_Playback_Systems","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629273"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629273/The_Performing_World_of_Digital_Archives"><img alt="Research paper thumbnail of The Performing World of Digital Archives" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div 
class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629273/The_Performing_World_of_Digital_Archives">The Performing World of Digital Archives</a></div><div class="wp-workCard_item"><span>De Boeck Supérieur eBooks</span><span>, 2013</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629273"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629273"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629273; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629273]").text(description); $(".js-view-count[data-work-id=110629273]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629273; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629273']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629273]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629273,"title":"The Performing World of Digital Archives","internal_url":"https://www.academia.edu/110629273/The_Performing_World_of_Digital_Archives","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629272"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" 
data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/110629272/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_II_Evaluation"><img alt="Research paper thumbnail of Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part II Evaluation" class="work-thumbnail" src="https://attachments.academia-assets.com/108388026/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/110629272/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_II_Evaluation">Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part II Evaluation</a></div><div class="wp-workCard_item"><span>Journal of The Audio Engineering Society</span><span>, Apr 21, 2016</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Diffuse Field Modeling (DFM) is a systematic means for simulating and reproducing a diffuse field...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Diffuse Field Modeling (DFM) is a systematic means for simulating and reproducing a diffuse field for arbitrary loudspeaker configurations. DFM is presented in two publications: Part I [1] presents the algorithm and this Part II reports the perceptual evaluation. Two experiments were conducted: in Experiment 1 sound recording professionals were to rate different treatments of DFM presented on a 20-channel array. The treatments under evaluation included the geometric modeling of reflections, strategies involving the early portion of the B-Format Room Impulse Response, and a comparison between 0 th and 1 st-order RIR. Results indicate that it is necessary to model the earliest reflections and to use all four channels of the B-Format room impulse response. In Experiment 2 musicians and sound recording professionals were asked to rate DFM and common microphone techniques presented on 3/2 stereophonic setup. 
DFM was found to be perceptually comparable with the Hamasaki Square technique.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="cc8594be08f5000d930d93b745eb4f16" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:108388026,&quot;asset_id&quot;:110629272,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/108388026/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629272"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629272"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629272; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629272]").text(description); $(".js-view-count[data-work-id=110629272]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629272; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629272']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "cc8594be08f5000d930d93b745eb4f16" } } $('.js-work-strip[data-work-id=110629272]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629272,"title":"Diffuse Field Modeling Using Physically-Inspired Decorrelation Filters and B-Format Microphones: Part II Evaluation","internal_url":"https://www.academia.edu/110629272/Diffuse_Field_Modeling_Using_Physically_Inspired_Decorrelation_Filters_and_B_Format_Microphones_Part_II_Evaluation","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine 
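The abstract refers to physically-inspired decorrelation filters but does not describe their construction. Purely as a generic illustration of what a decorrelation filter does (a common random-phase FIR design, explicitly not the physically-inspired filters evaluated in the paper), the sketch below builds two such filters and shows that copies of the same signal become nearly uncorrelated after filtering. Function names and parameters are made up.

```python
# Hedged sketch of a common decorrelation approach: an FIR filter with a
# flat magnitude spectrum and random phase, applied per loudspeaker channel.
# Generic illustration only, NOT the paper's physically-inspired design.
import numpy as np

def random_phase_decorrelator(n_taps=512, seed=0):
    """Return an FIR impulse response with unit magnitude and random phase."""
    rng = np.random.default_rng(seed)
    n_bins = n_taps // 2 + 1
    phase = rng.uniform(-np.pi, np.pi, n_bins)
    phase[0] = 0.0                 # keep the DC bin real
    if n_taps % 2 == 0:
        phase[-1] = 0.0            # keep the Nyquist bin real
    spectrum = np.exp(1j * phase)
    return np.fft.irfft(spectrum, n=n_taps)

# Decorrelate two copies of the same noise burst, as if feeding two speakers.
x = np.random.default_rng(1).standard_normal(48000)
y1 = np.convolve(x, random_phase_decorrelator(seed=10))[:len(x)]
y2 = np.convolve(x, random_phase_decorrelator(seed=20))[:len(x)]
print(f"inter-channel correlation: {np.corrcoef(y1, y2)[0, 1]:.3f}")  # near 0
```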
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":108388026,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/108388026/thumbnails/1.jpg","file_name":"jaes.2016.000220231205-1-3elcy3.pdf","download_url":"https://www.academia.edu/attachments/108388026/download_file","bulk_download_file_name":"Diffuse_Field_Modeling_Using_Physically.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/108388026/jaes.2016.000220231205-1-3elcy3-libre.pdf?1701787250=\u0026response-content-disposition=attachment%3B+filename%3DDiffuse_Field_Modeling_Using_Physically.pdf\u0026Expires=1741324679\u0026Signature=SzuoOQjUVezeqi-i0d7dlp6C01Sdqh3SDT4ukkiWE5e7gD-bce2vbrgRzN3V~aLManBxgU99GbSqrrqQIes89bbks3JuBAAIzQLJJU9RoOO31MQkZEc6MtO3BLivorg7y0mmmQQ2MFXFh2GkGWX8LD0L~XxDT-07WpHgZJLnc51frGRENtXQgPxBYIbDLr8Xzu6TXmIlT9YtfDkXbNNEBC2kvDSj-URX6plv9z~YXOj8IwArVHFaRM9v-PFBOc075P1OeaZi-YRCwilkiILihlsRjSYnuef4~-lxgVGjcrgitrtX90inpBbOqIRBtMjUkvF1~xltP9YkOn8JRxzzxw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="110629271"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/110629271/Exploiting_major_trends_in_subject_hierarchies_for_large_scale_collection_visualization"><img alt="Research paper thumbnail of Exploiting major trends in subject hierarchies for large-scale collection visualization" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/110629271/Exploiting_major_trends_in_subject_hierarchies_for_large_scale_collection_visualization">Exploiting major trends in subject hierarchies for large-scale collection visualization</a></div><div class="wp-workCard_item"><span>Proceedings of SPIE</span><span>, Jan 22, 2012</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">ABSTRACT Many large digital collections are currently organized by subject; however, these useful...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">ABSTRACT Many large digital collections are currently organized by subject; however, these useful information organization structures are large and complex, making them difficult to browse. Current online tools and visualization prototypes show small localized subsets and do not provide the ability to explore the predominant patterns of the overall subject structure. This research addresses this issue by simplifying the subject structure using two techniques based on the highly uneven distribution of real-world collections: level compression and child pruning. The approach is demonstrated using a sample of 130K records organized by the Library of Congress Subject Headings (LCSH). 
Promising results show that the subject hierarchy can be reduced down to 42% of its initial size, while maintaining access to 81% of the collection. The visual impact is demonstrated using a traditional outline view allowing searchers to dynamically change the amount of complexity that they feel necessary for the tasks at hand.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="110629271"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="110629271"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 110629271; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=110629271]").text(description); $(".js-view-count[data-work-id=110629271]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 110629271; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='110629271']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=110629271]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":110629271,"title":"Exploiting major trends in subject hierarchies for large-scale collection visualization","internal_url":"https://www.academia.edu/110629271/Exploiting_major_trends_in_subject_hierarchies_for_large_scale_collection_visualization","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> </div><div class="profile--tab_content_container js-tab-pane tab-pane" data-section-id="4884032" id="conferenceproceedings"><div 
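The abstract names "child pruning" as one of the two simplification techniques but gives no procedure. As a minimal sketch of one plausible interpretation (folding sparsely used subtrees into their parent so the outline shrinks while the items stay reachable), the snippet below prunes a toy subject hierarchy. The dictionary layout, threshold, and heading names are assumptions for illustration, not the paper's exact procedure or data.

```python
# Hedged sketch of "child pruning" on a subject hierarchy: subtrees holding
# fewer items than a threshold are folded into their parent heading.
# Illustrative only; not the exact procedure from the paper.

def subtree_count(node):
    """Total number of items held by a node and all of its descendants."""
    return node["items"] + sum(subtree_count(c) for c in node["children"])

def prune(node, min_items):
    kept = []
    for child in node["children"]:
        if subtree_count(child) < min_items:
            node["items"] += subtree_count(child)  # fold small subtree into parent
        else:
            prune(child, min_items)                # recurse into larger subtrees
            kept.append(child)
    node["children"] = kept
    return node

tree = {"label": "Engineering", "items": 5, "children": [
    {"label": "Acoustical engineering", "items": 40, "children": []},
    {"label": "Obscure subtopic", "items": 2, "children": []},
]}
prune(tree, min_items=10)
print([c["label"] for c in tree["children"]], tree["items"])
# -> ['Acoustical engineering'] 7   (the rare heading is gone, its items kept)
```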
class="js-work-strip profile--work_container" data-work-id="23566261"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23566261/Discrimination_Between_Phonograph_Playback_Systems"><img alt="Research paper thumbnail of Discrimination Between Phonograph Playback Systems" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23566261/Discrimination_Between_Phonograph_Playback_Systems">Discrimination Between Phonograph Playback Systems</a></div><div class="wp-workCard_item wp-workCard--coauthors"><span>by </span><span><a class="" data-click-track="profile-work-strip-authors" href="https://mcgill.academia.edu/CatherineGuastavino">Catherine Guastavino</a> and <a class="" data-click-track="profile-work-strip-authors" href="https://mcgill.academia.edu/IchiroFujinaga">Ichiro Fujinaga</a></span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">... expert listeners, 10 males and 4 females, with a mean age of 33.07 (SD = 8.27), took part in ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">... expert listeners, 10 males and 4 females, with a mean age of 33.07 (SD = 8.27), took part in condition 2. We attempted to retain as many participants from the first condition as possible, and were able to retain 8. Participants received $20 CAD for their participation. <a href="http://coltrane" rel="nofollow">http://coltrane</a>. 
...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566261"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566261"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23566261; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566261]").text(description); $(".js-view-count[data-work-id=23566261]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566261; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566261']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23566261]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566261,"title":"Discrimination Between Phonograph Playback Systems","internal_url":"https://www.academia.edu/23566261/Discrimination_Between_Phonograph_Playback_Systems","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566265"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/23566265/An_interdisciplinary_approach_to_audio_effect_classification"><img alt="Research paper thumbnail of An interdisciplinary approach to audio effect classification" class="work-thumbnail" src="https://attachments.academia-assets.com/43991435/thumbnails/1.jpg" /></a></div><div class="wp-workCard 
An interdisciplinary approach to audio effect classification
by Catherine Guastavino, Vincent Verfaille, and Caroline Traube
… of the 9th International Conference on …, 2006
Abstract: The aim of this paper is to propose an interdisciplinary classification of digital audio effects to facilitate communication and collaborations between DSP programmers, sound engineers, composers, performers and musicologists. After reviewing classifications reflecting technological, technical and perceptual points of view, we introduce a transverse classification to link discipline-specific classifications into a single network containing various layers of descriptors, ranging from low-level to high-level features. Simple tools using the interdisciplinary classification are introduced to facilitate navigation between effects, underlying techniques, perceptual attributes and semantic descriptors. Finally, we present concluding remarks on implications for teaching and for the development of audio effect user interfaces based on perceptual features rather than technical parameters.
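The transverse classification is described only at a high level here. Purely as an illustration, it can be pictured as a small layered mapping that links effect names to technique-level and perceptual descriptors; the layer names and entries below are invented for this sketch and are not taken from the paper.

```python
# Illustrative only: a tiny layered network linking audio effects to
# technique-level and perceptual descriptors, in the spirit of the
# transverse classification described above. All entries are hypothetical.

classification = {
    "flanger": {
        "technique":  ["time-varying delay line", "feedback"],
        "perceptual": ["timbre modulation", "motion"],
    },
    "reverb": {
        "technique":  ["convolution", "feedback delay network"],
        "perceptual": ["spaciousness", "distance"],
    },
}


def effects_with_attribute(attribute: str) -> list:
    """Navigate from a perceptual descriptor back to the effects that expose it."""
    return [name for name, layers in classification.items()
            if attribute in layers["perceptual"]]


print(effects_with_attribute("spaciousness"))   # -> ['reverb']
```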
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":43991435,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/43991435/thumbnails/1.jpg","file_name":"VerfailleGuastavinoTraube2006DAfx.pdf","download_url":"https://www.academia.edu/attachments/43991435/download_file","bulk_download_file_name":"An_interdisciplinary_approach_to_audio_e.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/43991435/VerfailleGuastavinoTraube2006DAfx-libre.pdf?1458656446=\u0026response-content-disposition=attachment%3B+filename%3DAn_interdisciplinary_approach_to_audio_e.pdf\u0026Expires=1741324680\u0026Signature=IkO4jQpKP3jMAEO0a3S-6uPj-wQsWD5iiq7ONttJv2sL11c4Z0c6ehHw-DdHNJFvnntRVtrCwe8bZA82O-irp6TNesMnFiftliET6ATZBWNOQYCQ6mz12R7yhnr6hdSZ-eQ3gxlSB~81ezmucOwVDseFGJIcnzQNQaD-WcrlFiElcaMAUhnYcpXef-U-NCj6IOcHYkwWx7PYdfoJUF1TzH1TgK6K0Hhf4vPnXtU6Xe46b2xBRmYojPo1Kph3SoSL5zOcagEpZ~R626RbMPnbTBiB4cgAI6KqgX1BPIwxEcAaA~UbQCtnq0nMB~nsg6Xk1MYKyYI3kOmFeREXmWuALA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566272"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23566272/Common_and_distinctive_features_of_similarity_Effect_of_behavioral_method"><img alt="Research paper thumbnail of Common and distinctive features of similarity: Effect of behavioral method" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23566272/Common_and_distinctive_features_of_similarity_Effect_of_behavioral_method">Common and distinctive features of similarity: Effect of behavioral method</a></div><div class="wp-workCard_item"><span>PsycEXTRA Dataset</span><span>, 2000</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Giordano, BL, Guastavino, C., Murphy, E., Ogg, M., Smith, BK, and McAdams, S. (2009) Common and d...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Giordano, BL, Guastavino, C., Murphy, E., Ogg, M., Smith, BK, and McAdams, S. (2009) Common and distinctive features of similarity: effect of behavioral method. In: 50th Annual Meeting of the Psychonomic Society, 19-22 Nov 2009, Boston, MA, USA. ... Full text not currently available from Enlighten. ... 
Giordano, BL, Guastavino, C., Murphy, E., Ogg, M., Smith, BK, and McAdams, S.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566272"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566272"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23566272; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566272]").text(description); $(".js-view-count[data-work-id=23566272]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566272; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566272']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23566272]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566272,"title":"Common and distinctive features of similarity: Effect of behavioral method","internal_url":"https://www.academia.edu/23566272/Common_and_distinctive_features_of_similarity_Effect_of_behavioral_method","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566283"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/23566283/Moving_forward_conceptualizing_comfort_in_information_sources_for_enthusiast_cyclists"><img alt="Research paper thumbnail of Moving forward: conceptualizing comfort in information 
Moving forward: conceptualizing comfort in information sources for enthusiast cyclists
Abstract: This research aims to identify how the notion of comfort in the context of bicycling is conveyed in an American bicycling magazine and online forum. An iterative approach, comprising a content analysis and a linguistic discourse analysis, aims to go beyond the generally accepted definition that focuses on vibrations and to identify the different concepts and typical situations relevant to studying comfort for enthusiast cyclists. We discuss the selection criteria for the magazine and online forum, the development of the coding protocol (including the final operational definition of each concept), and the method for retrieving and analyzing online forum posts. A quantitative analysis of the number of occurrences of each concept and theme, combined with a qualitative analysis of pronoun use, positive and negative descriptors, and opposing statements, shows a complex link between the cyclist, the bicycle, and the environment. The behaviour of the bicycle, environmental factors, what the cyclist thinks and feels, and the goal of the ride all affect how comfort is conceptualized.
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":43991449,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://a.academia-assets.com/images/blank-paper.jpg","file_name":"meet.2011.14504801187.pdf20160322-4458-1cwip9r","download_url":"https://www.academia.edu/attachments/43991449/download_file","bulk_download_file_name":"Moving_forward_conceptualizing_comfort_i.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/43991449/meet.2011.14504801187.pdf20160322-4458-1cwip9r?1738303829=\u0026response-content-disposition=attachment%3B+filename%3DMoving_forward_conceptualizing_comfort_i.pdf\u0026Expires=1741324680\u0026Signature=S464u-h2o1dTO9XfVHePkK64ZkYRYadvNZyGGmSHT66xi5YJN5WacUXJcIgQ-V6dzS~X~kN6AkhVP8Or6jPl7wPfhLPvOIYE~Tmgz~x5lbjzdigixpC749W~o9pzT6Gkp6BFsf5bwH58N9zh13apOU0jWIVSl7i1QG3f2MdnWWbiFkuDrBdhmLkNfbJG1ZK4D9~ni5ALjymnYDMWKV342vThXNuRxUeyHgubHLNZjFR58hDeFjLyUqk24kDnwr8M9uu3EcPUZxbkQU96sQUMKdBJVv47hcKezdvaw-WWVfC~nWGLOeVeTudoUYwDq5xcKjc2RCychxMHjRQeOuojWw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566285"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/23566285/Investigating_soundscape_affordances_through_activity_appropriateness"><img alt="Research paper thumbnail of Investigating soundscape affordances through activity appropriateness" class="work-thumbnail" src="https://attachments.academia-assets.com/43991444/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/23566285/Investigating_soundscape_affordances_through_activity_appropriateness">Investigating soundscape affordances through activity appropriateness</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">Central to the concept of soundscape is the understanding of the acoustic environment in context....</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">Central to the concept of soundscape is the understanding of the acoustic environment in context. Previous research indicates that people understand soundscapes through their potential for activities. One way to look at activities is through the concept of affordances -defined as the actionable properties of an object. In this study, the object is a location and time in the city. Fifteen participants listened to stereo recordings of 8 outdoor sites in Paris and Montreal. In each trial, they evaluated on a continuous scale how appropriate the soundscapes were for a given activity. Four activities were considered and presented in random order: studying for an exam, meeting up with a friend, riding a bike and relaxing. Participants justified their ratings in free-format comments. 
A 8(Soundscapes) x 4(Activities) factorial ANOVAs revealed significant effects of Soundscape and Activities and Soundscape*Activities on appropriateness ratings. Certain soundscapes were found to accommodate specific activities only while others were found to potentially accommodate all activities or none (prominent mechanical/traffic noise.) We also analyzed comments to further understand how participants envision utilizing the soundscape/environment and attribute meanings to the various sounds present.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="0d6119e5eefaa97b27b2f220e5e0999d" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:43991444,&quot;asset_id&quot;:23566285,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/43991444/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566285"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566285"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23566285; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566285]").text(description); $(".js-view-count[data-work-id=23566285]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566285; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566285']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "0d6119e5eefaa97b27b2f220e5e0999d" } } $('.js-work-strip[data-work-id=23566285]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566285,"title":"Investigating soundscape affordances through activity 
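For readers unfamiliar with the design, an 8 x 4 within-subjects analysis of this kind could be run along the following lines in Python with statsmodels. The column names ('participant', 'soundscape', 'activity', 'rating') and the input file are assumptions for illustration, not the authors' analysis script.

```python
# Sketch of an 8 (Soundscape) x 4 (Activity) repeated-measures ANOVA on
# appropriateness ratings. Column names and data layout are hypothetical.

import pandas as pd
from statsmodels.stats.anova import AnovaRM

# long-format data: one row per participant x soundscape x activity
ratings = pd.read_csv("appropriateness_ratings.csv")   # hypothetical file

model = AnovaRM(
    data=ratings,
    depvar="rating",
    subject="participant",
    within=["soundscape", "activity"],
)
result = model.fit()
print(result)   # F statistics for Soundscape, Activity, and their interaction
```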
appropriateness","internal_url":"https://www.academia.edu/23566285/Investigating_soundscape_affordances_through_activity_appropriateness","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":43991444,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/43991444/thumbnails/1.jpg","file_name":"Investigating_soundscape_affordances_thr20160322-7932-r214we.pdf","download_url":"https://www.academia.edu/attachments/43991444/download_file","bulk_download_file_name":"Investigating_soundscape_affordances_thr.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/43991444/Investigating_soundscape_affordances_thr20160322-7932-r214we-libre.pdf?1458656445=\u0026response-content-disposition=attachment%3B+filename%3DInvestigating_soundscape_affordances_thr.pdf\u0026Expires=1741324680\u0026Signature=d4jkIkw1F~aqY9pQgibwpuyIynj5pOCifooa~i-rw-rt9RkMcdNg5i8~LssxpJRdrzot-6L~dEUlkdzMtHOnjS29DDnGdRXBU44x-hUwy3u0apSZlHJ-GkImZKm4YdDvNZomINVHSSult1A6-e05FyQVC3mMmGQTQpxtp7-mtntsrEQaXyeZh5yqBSYTBr4S009SQRnWkg2Sal2BKbHthcV8qMJHH8rmKYe0jO9OXYTHTjAAKiil~T1JDgw7bK5GPdOYaUCEzHO-Fklfr3WSmIW6GcRFMBffW6tqnXGh2bIOZv4wdhBjMo7lC7DVTqgtQHCAGWfaGxgFG38cUdHGwg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566289"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23566289/The_information_world_of_enthusiast_cyclists"><img alt="Research paper thumbnail of The information world of enthusiast cyclists" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23566289/The_information_world_of_enthusiast_cyclists">The information world of enthusiast cyclists</a></div><div class="wp-workCard_item wp-workCard--coauthors"><span>by </span><span><a class="" data-click-track="profile-work-strip-authors" href="https://mcgill.academia.edu/CatherineGuastavino">Catherine Guastavino</a> and <a class="" data-click-track="profile-work-strip-authors" href="https://mcgill.academia.edu/FouazAyachi">Fouaz Ayachi</a></span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566289"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566289"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var 
workId = 23566289; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566289]").text(description); $(".js-view-count[data-work-id=23566289]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566289; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566289']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23566289]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566289,"title":"The information world of enthusiast cyclists","internal_url":"https://www.academia.edu/23566289/The_information_world_of_enthusiast_cyclists","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566290"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23566290/A_Perceptual_Evaluation_of_Recording_Rendering_and_Reproduction_Techniques_for_Multichannel_Spatial_Audio"><img alt="Research paper thumbnail of A Perceptual Evaluation of Recording, Rendering, and Reproduction Techniques for Multichannel Spatial Audio" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23566290/A_Perceptual_Evaluation_of_Recording_Rendering_and_Reproduction_Techniques_for_Multichannel_Spatial_Audio">A Perceptual Evaluation of Recording, Rendering, and Reproduction Techniques for Multichannel Spatial Audio</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action 
visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566290"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566290"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23566290; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566290]").text(description); $(".js-view-count[data-work-id=23566290]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566290; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566290']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23566290]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566290,"title":"A Perceptual Evaluation of Recording, Rendering, and Reproduction Techniques for Multichannel Spatial Audio","internal_url":"https://www.academia.edu/23566290/A_Perceptual_Evaluation_of_Recording_Rendering_and_Reproduction_Techniques_for_Multichannel_Spatial_Audio","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566291"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23566291/A_Perceptual_Evaluation_of_Room_Effect_Methods_for_Multichannel_Spatial_Audio"><img alt="Research paper thumbnail of A Perceptual Evaluation of Room Effect Methods for Multichannel Spatial Audio" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard 
wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23566291/A_Perceptual_Evaluation_of_Room_Effect_Methods_for_Multichannel_Spatial_Audio">A Perceptual Evaluation of Room Effect Methods for Multichannel Spatial Audio</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566291"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566291"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23566291; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566291]").text(description); $(".js-view-count[data-work-id=23566291]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566291; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566291']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23566291]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566291,"title":"A Perceptual Evaluation of Room Effect Methods for Multichannel Spatial Audio","internal_url":"https://www.academia.edu/23566291/A_Perceptual_Evaluation_of_Room_Effect_Methods_for_Multichannel_Spatial_Audio","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566297"><div class="profile--work_thumbnail 
hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23566297/Soundscapes_from_noise_annoyance_to_the_music_of_urban_life"><img alt="Research paper thumbnail of Soundscapes: from noise annoyance to the music of urban life" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23566297/Soundscapes_from_noise_annoyance_to_the_music_of_urban_life">Soundscapes: from noise annoyance to the music of urban life</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">ABSTRACT Through an overview of empirical research over the past ten years, we present an interpr...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">ABSTRACT Through an overview of empirical research over the past ten years, we present an interpretation of the evolution of the concept of soundscape. From a strictly acoustic definition in terms of physical descriptors, it evolved into a more complex notion integrating the effects of noise on citizens. Soundscape research therefore requires the contribution of human sciences (psychology of perception, cognitive psychology, sociology, anthropology of senses). There is converging evidence that people judgments of noise is related to the meaning given to the activities producing the noise. Consequently, physical descriptions in acoustics have to face the diversity of human reactions to noise correlated with the diversity of activities, the diversity of source producing noises within a diversity of physical environments (natural, architectural). This paper focuses on major emerging issues in soundscapes research, namely - the integration of such a diversity of pieces of knowledge within a general knowledge - the translation of these diverse conceptualizations into physical descriptions - the elaboration of convincing physical measurements for decision makers. 
We will draw consequences for further developments in the cooperative and pluridisciplinary research and for producing guidelines for new orientations in community policies.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566297"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566297"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23566297; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566297]").text(description); $(".js-view-count[data-work-id=23566297]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566297; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566297']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23566297]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566297,"title":"Soundscapes: from noise annoyance to the music of urban life","internal_url":"https://www.academia.edu/23566297/Soundscapes_from_noise_annoyance_to_the_music_of_urban_life","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566303"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23566303/Applying_the_stratified_model_of_relevance_interactions_to_music_information_retrieval"><img 
alt="Research paper thumbnail of Applying the stratified model of relevance interactions to music information retrieval" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23566303/Applying_the_stratified_model_of_relevance_interactions_to_music_information_retrieval">Applying the stratified model of relevance interactions to music information retrieval</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">ABSTRACT While research on the notion of relevance has a long and rich history in information ret...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">ABSTRACT While research on the notion of relevance has a long and rich history in information retrieval for textual documents, formal considerations of relevance concepts in Music Information Retrieval (MIR) remain scarce. We discuss the application of Saracevic&amp;amp;#39;s stratified model of relevance interactions to the music information domain. This model offers a tool for deliberation on the development of user-oriented MIR systems, and a framework for the aggregation of findings on the music information needs and behaviours of potential users.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566303"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566303"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23566303; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566303]").text(description); $(".js-view-count[data-work-id=23566303]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566303; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566303']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 
})(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23566303]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566303,"title":"Applying the stratified model of relevance interactions to music information retrieval","internal_url":"https://www.academia.edu/23566303/Applying_the_stratified_model_of_relevance_interactions_to_music_information_retrieval","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566304"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23566304/Languages_and_conceptualization_of_soundscapes_A_cross_linguistic_analysis"><img alt="Research paper thumbnail of Languages and conceptualization of soundscapes: A cross-linguistic analysis" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23566304/Languages_and_conceptualization_of_soundscapes_A_cross_linguistic_analysis">Languages and conceptualization of soundscapes: A cross-linguistic analysis</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">ABSTRACT In the past decade, soundscape research has emerged accounting for acoustic phenomena as...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">ABSTRACT In the past decade, soundscape research has emerged accounting for acoustic phenomena as they are perceived and assessed by humans. In this view, concepts and methodologies from social sciences and humanities are needed to identify the diversity of conceptualizations across time, space, and languages. Specifically, our approach relies on linguistics and psychology in analyzing how people describe their sensory experience (what is being said and how it is being said), in order to identify different conceptualizations conveyed in their discourse. 
We first investigate the linguistic resources available in different languages with a cross-linguistic survey of free-format verbal descriptions of acoustic phenomena in European languages (e.g., French, English, Dutch, Spanish), extending the pilot investigation by Dubois and Guastavino (2008). Then, coupling this linguistic analysis with cognitive theories on categorization, we can infer a diversity of conceptualizations for the same acoustic phenomena. This approach further allows us to overcome some limitations of current survey design: the use of closed-ended questions confining responses to categories pre-defined by the experimenter, and basic translations not taking into consideration semantic languages specificities. Our results provide a theoretical grounding and methodological guidelines for designing questionnaires for cross-cultural evaluation of soundscapes.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566304"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566304"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23566304; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566304]").text(description); $(".js-view-count[data-work-id=23566304]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566304; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566304']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23566304]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566304,"title":"Languages and conceptualization of soundscapes: A cross-linguistic 
analysis","internal_url":"https://www.academia.edu/23566304/Languages_and_conceptualization_of_soundscapes_A_cross_linguistic_analysis","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566306"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23566306/Voice_Categorization_Using_Free_Sorting_Tasks"><img alt="Research paper thumbnail of Voice Categorization Using Free Sorting Tasks" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23566306/Voice_Categorization_Using_Free_Sorting_Tasks">Voice Categorization Using Free Sorting Tasks</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566306"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566306"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23566306; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566306]").text(description); $(".js-view-count[data-work-id=23566306]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566306; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566306']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); 
dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23566306]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566306,"title":"Voice Categorization Using Free Sorting Tasks","internal_url":"https://www.academia.edu/23566306/Voice_Categorization_Using_Free_Sorting_Tasks","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566317"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/23566317/The_Sensor_City_Initiative_cognitive_sensors_for_soundscape_transformations"><img alt="Research paper thumbnail of The Sensor City Initiative: cognitive sensors for soundscape transformations" class="work-thumbnail" src="https://attachments.academia-assets.com/43991458/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/23566317/The_Sensor_City_Initiative_cognitive_sensors_for_soundscape_transformations">The Sensor City Initiative: cognitive sensors for soundscape transformations</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">The authors introduce a novel urban measurement system. Sensor City, located in the Netherlands, ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">The authors introduce a novel urban measurement system. Sensor City, located in the Netherlands, is a large network of outdoor sensor nodes with high-bandwidth communication to a central database and GIS. The network expands the ability to systematically investigate human perception and evaluation in real urban settings and it is the first known cognitive sensor network available to soundscape researchers. The emphasis for this paper is on soundscape evaluation, or the way the acoustic environment is perceived. The authors outline the challenges posed by a cognitive sensor system and how it can approach a more human- like understanding of soundscape, such as through the detection of meaningful events, deciding what data to record, and in-field sensor placement. 
The authors then explore the potential benefits to the host city in a number of domains and highlight upcoming research directions that rely on this technology, such as the automatic judgment of appropriateness in an urban se...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="97187f88650458c41f298dd91efbace5" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:43991458,&quot;asset_id&quot;:23566317,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/43991458/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566317"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566317"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23566317; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566317]").text(description); $(".js-view-count[data-work-id=23566317]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566317; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566317']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "97187f88650458c41f298dd91efbace5" } } $('.js-work-strip[data-work-id=23566317]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566317,"title":"The Sensor City Initiative: cognitive sensors for soundscape transformations","internal_url":"https://www.academia.edu/23566317/The_Sensor_City_Initiative_cognitive_sensors_for_soundscape_transformations","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine 
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":43991458,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/43991458/thumbnails/1.jpg","file_name":"The_Sensor_City_Initiative_cognitive_sen20160322-7935-1bbose4.pdf","download_url":"https://www.academia.edu/attachments/43991458/download_file","bulk_download_file_name":"The_Sensor_City_Initiative_cognitive_sen.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/43991458/The_Sensor_City_Initiative_cognitive_sen20160322-7935-1bbose4-libre.pdf?1458656448=\u0026response-content-disposition=attachment%3B+filename%3DThe_Sensor_City_Initiative_cognitive_sen.pdf\u0026Expires=1741338292\u0026Signature=asbcP~x5nm3fgth4RhewgQeFAabU2joNOclba0aCUU64PJDJoLcpZenUJZWi2QTYXWfqP~aF98zoAASYGHtAxOdOXG~CWJdALmH0syoRLM293RSn6VZKDuXsZ8Utedv7lR6bGs5ZH2yOTYNdXtjAssCZbUrV-J9t5lODyL8GoZnFtShGWI1ZP9r3JZ-yy1BAtasqOkq~GmZCnoCZb5MeTu9vbMQ-cluDwrNT4F2u26ivZtfcvvIkd8nG~EjFjCRmzhbLKbdZInHkzDqaa18diWGK3PUMFwY0bQW1~LyjsRmNso33LATYThHUBxr8Z1e-J5fZZIAFj5upUksBgbIokg__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23624830"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23624830/Soundscape_ecology_A_worldwide_network"><img alt="Research paper thumbnail of Soundscape ecology: A worldwide network" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23624830/Soundscape_ecology_A_worldwide_network">Soundscape ecology: A worldwide network</a></div><div class="wp-workCard_item"><span>The Journal of the Acoustical Society of America</span><span>, 2011</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">The overarching objective of our network is to bring together acousticians, cognitive psychologis...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">The overarching objective of our network is to bring together acousticians, cognitive psychologists, ecologists, and creative artists to integrate how they study and perceive soundscapes and use this knowledge to help shape a research agenda for the conservation of soundscapes. Many natural soundscapes are being threatened from various directions, eg, habitat destruction, climate change, invasive species. 
This project aims at recording and documenting soundscapes in remote locations and identifying conceptualizations of these ...</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23624830"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23624830"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23624830; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23624830]").text(description); $(".js-view-count[data-work-id=23624830]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23624830; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23624830']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23624830]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23624830,"title":"Soundscape ecology: A worldwide network","internal_url":"https://www.academia.edu/23624830/Soundscape_ecology_A_worldwide_network","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23624846"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23624846/Measuring_the_perceived_restorativeness_of_soundscapes_is_it_about_the_sounds_the_person_or_the_environment"><img alt="Research paper thumbnail of Measuring the perceived restorativeness 
of soundscapes: is it about the sounds, the person, or the environment?" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23624846/Measuring_the_perceived_restorativeness_of_soundscapes_is_it_about_the_sounds_the_person_or_the_environment">Measuring the perceived restorativeness of soundscapes: is it about the sounds, the person, or the environment?</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23624846"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23624846"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23624846; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23624846]").text(description); $(".js-view-count[data-work-id=23624846]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23624846; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23624846']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23624846]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23624846,"title":"Measuring the perceived restorativeness of soundscapes: is it about the sounds, the person, or the environment?","internal_url":"https://www.academia.edu/23624846/Measuring_the_perceived_restorativeness_of_soundscapes_is_it_about_the_sounds_the_person_or_the_environment","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine 
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23624849"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23624849/Different_perceptions_of_information_design_according_to_the_disciplines_and_the_areas_of_expertise_Analysis_of_the_English_scientific_literature"><img alt="Research paper thumbnail of Different perceptions of&quot; information design&quot; according to the disciplines and the areas of expertise: Analysis of the English scientific literature" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23624849/Different_perceptions_of_information_design_according_to_the_disciplines_and_the_areas_of_expertise_Analysis_of_the_English_scientific_literature">Different perceptions of&quot; information design&quot; according to the disciplines and the areas of expertise: Analysis of the English scientific literature</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23624849"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23624849"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23624849; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23624849]").text(description); $(".js-view-count[data-work-id=23624849]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23624849; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23624849']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || 
_.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23624849]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23624849,"title":"Different perceptions of\" information design\" according to the disciplines and the areas of expertise: Analysis of the English scientific literature","internal_url":"https://www.academia.edu/23624849/Different_perceptions_of_information_design_according_to_the_disciplines_and_the_areas_of_expertise_Analysis_of_the_English_scientific_literature","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23624856"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23624856/Analyzing_Melodic_Similarity_Judgements_in_Flamenco_a_Cappella_Singing"><img alt="Research paper thumbnail of Analyzing Melodic Similarity Judgements in Flamenco a Cappella Singing" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23624856/Analyzing_Melodic_Similarity_Judgements_in_Flamenco_a_Cappella_Singing">Analyzing Melodic Similarity Judgements in Flamenco a Cappella Singing</a></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23624856"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23624856"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23624856; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23624856]").text(description); $(".js-view-count[data-work-id=23624856]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23624856; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23624856']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); 
Sampling Rate Discrimination: 44.1 KHz Vs. 88.2 KHz
by Catherine Guastavino and Amandine Pras
https://www.academia.edu/36185518/Sampling_Rate_Discrimination_44_1_KHz_Vs_88_2_KHz
It is currently common practice for sound engineers to record digital music using high-resolution formats, and then downsample the files to 44.1 kHz for commercial release. This study aims at investigating whether listeners can perceive differences between musical files recorded ...

Qualitative evaluation of Wave Field Synthesis with expert listeners
by Amandine Pras and Catherine Guastavino
https://www.academia.edu/36185531/Qualitative_evaluation_of_Wave_Field_Synthesis_with_expert_listeners
listeners","internal_url":"https://www.academia.edu/441307/Qualitative_evaluation_of_Wave_Field_Synthesis_with_expert_listeners","owner_id":153328,"coauthors_can_edit":true,"owner":{"id":153328,"first_name":"Amandine","middle_initials":null,"last_name":"Pras","page_name":"AmandinePras","domain_name":"york","created_at":"2010-03-27T23:55:27.067-07:00","display_name":"Amandine Pras","url":"https://york.academia.edu/AmandinePras"},"attachments":[{"id":34367788,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://a.academia-assets.com/images/blank-paper.jpg","file_name":"PrasCorteelGuastavino-DAGA2009.pdf","download_url":"https://www.academia.edu/attachments/34367788/download_file","bulk_download_file_name":"Qualitative_evaluation_of_Wave_Field_Syn.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/34367788/PrasCorteelGuastavino-DAGA2009-libre.pdf?1407225287=\u0026response-content-disposition=attachment%3B+filename%3DQualitative_evaluation_of_Wave_Field_Syn.pdf\u0026Expires=1741192869\u0026Signature=ZcmiM8Fp0LM~5w6sVYrP08snTa1Oq9hH6pGk~gXfZ9t57y1b-H0Ivd0ZBS-LnNFVbg2OtcR2lF8PplCRSaN7xtJVbBGYwFAxg2PbS5rv2LvkrWFIgDpuVmhy~hzHvMyeQCNf48-oqF20~J7zMVjpdMPVzsy-JKhHTUH~k5Ocid0uH6tKw8UCVhUXUAV2MSvluXR1aAxKZJVtwzDle9aYTzUDhk4kgm7tjdbntcdgHKux192FOqC1isYTjrZaq-eSub3TJlILBwAbih4uWailoqAW5ZtaOlLAYEQXxSzQWFc-3tDHERhFxrwtqcz--2Y7i7CL2SORD-KzwZZMIEWeVw__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> </div><div class="profile--tab_content_container js-tab-pane tab-pane" data-section-id="4884033" id="journalarticles"><div class="js-work-strip profile--work_container" data-work-id="83817852"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/83817852/A_sonic_perspective_for_the_post_pandemic_future_of_entertainment_districts_the_case_of_Montreal_s_Quartier_des_Spectacles"><img alt="Research paper thumbnail of A sonic perspective for the post-pandemic future of entertainment districts: the case of Montreal’s Quartier des Spectacles" class="work-thumbnail" src="https://attachments.academia-assets.com/89298460/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/83817852/A_sonic_perspective_for_the_post_pandemic_future_of_entertainment_districts_the_case_of_Montreal_s_Quartier_des_Spectacles">A sonic perspective for the post-pandemic future of entertainment districts: the case of Montreal’s Quartier des Spectacles</a></div><div class="wp-workCard_item wp-workCard--coauthors"><span>by </span><span><a class="" data-click-track="profile-work-strip-authors" href="https://iuav1.academia.edu/NicolaDiCroce">Nicola Di Croce</a> and <a class="" data-click-track="profile-work-strip-authors" href="https://mcgill.academia.edu/CatherineGuastavino">Catherine Guastavino</a></span></div><div class="wp-workCard_item"><span>Journal of Environmental Planning and Management </span><span>, 2022</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">In 2020, the pandemic impacted the social and economic dynamics of cities around the world. 
Enter...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">In 2020, the pandemic impacted the social and economic dynamics of cities around the world. Entertainment districts hosting events, festivals, and other cultural activities were particularly affected, as their loss of attractiveness also impacted their livability. Reflecting on how the experience of the sonic environment contributes to attractiveness and livability in an urban environment, we propose a sonic perspective to investigate the impact of the COVID-19 pandemic in Montreal’s entertainment district – Quartier des Spectacles (QDS). Through semi-structured interviews, we focus on how the sonic experience of QDS’s residents changed throughout 2020, and on how their experiences can provide valuable insight into addressing the district’s future planning and management. Looking at QDS as a case study to orient the post-pandemic trajectories of entertainment districts, we present a number of sound-related governance recommendations aimed at strengthening QDS residents’ involvement in the neighborhood’s cultural, artistic, and political life and its decision-making processes.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="e4ca0e520576946be09992e2ea7eea5b" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:89298460,&quot;asset_id&quot;:83817852,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/89298460/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="83817852"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="83817852"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 83817852; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=83817852]").text(description); $(".js-view-count[data-work-id=83817852]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 83817852; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='83817852']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 
})(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "e4ca0e520576946be09992e2ea7eea5b" } } $('.js-work-strip[data-work-id=83817852]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":83817852,"title":"A sonic perspective for the post-pandemic future of entertainment districts: the case of Montreal’s Quartier des Spectacles","internal_url":"https://www.academia.edu/83817852/A_sonic_perspective_for_the_post_pandemic_future_of_entertainment_districts_the_case_of_Montreal_s_Quartier_des_Spectacles","owner_id":10688207,"coauthors_can_edit":true,"owner":{"id":10688207,"first_name":"Nicola","middle_initials":null,"last_name":"Di Croce","page_name":"NicolaDiCroce","domain_name":"iuav1","created_at":"2014-03-31T20:32:52.340-07:00","display_name":"Nicola Di Croce","url":"https://iuav1.academia.edu/NicolaDiCroce"},"attachments":[{"id":89298460,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/89298460/thumbnails/1.jpg","file_name":"Di_Croce_et_al._2020_JPEM.pdf","download_url":"https://www.academia.edu/attachments/89298460/download_file","bulk_download_file_name":"A_sonic_perspective_for_the_post_pandemi.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/89298460/Di_Croce_et_al._2020_JPEM-libre.pdf?1659716515=\u0026response-content-disposition=attachment%3B+filename%3DA_sonic_perspective_for_the_post_pandemi.pdf\u0026Expires=1741338292\u0026Signature=Gwnd9u1FIcLE2xFSfdWn4Lge4Ljhl501ftEVhr~sYC6DrwtiXNHv2OpiR~cWJ60eYKEA5fzJ-9QnlDSRFkmOGr9R9SXGe~fXo8y9CsrLufK8yo-Qy2aM40p6-~QBseXn1BMtMpQxAHARV6LWPBUOU~yzIJhmKtsw2~T4z47FJuA8SRO0RPQNW~gkGKZl5aPnyxlk8-w6b6o9ODhrTweBBNi8LQuxwfJWIi1wcUSXcWo426rPAlVvOOs1LWs1z6ID11TwLxTNKQc4X4jwxo4rmCYEYDDSKrnzIn6sAhzv3OTWAWcVZeu1ZwvCETTuBFv-9pzSnahqfrwiR-uVTWg3ug__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566267"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" href="https://www.academia.edu/23566267/Archiving_electroacoustic_and_mixed_music_Significant_knowledge_involved_in_the_creative_process_of_works_with_spatialisation"><img alt="Research paper thumbnail of Archiving electroacoustic and mixed music: Significant knowledge involved in the creative process of works with spatialisation" class="work-thumbnail" src="https://a.academia-assets.com/images/blank-paper.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" rel="nofollow" href="https://www.academia.edu/23566267/Archiving_electroacoustic_and_mixed_music_Significant_knowledge_involved_in_the_creative_process_of_works_with_spatialisation">Archiving electroacoustic and mixed music: Significant knowledge 
involved in the creative process of works with spatialisation</a></div><div class="wp-workCard_item"><span>Journal of Documentation</span><span>, 2012</span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">ABSTRACT Purpose – The purpose of this paper is to identify, operationalise, and test a knowledge...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">ABSTRACT Purpose – The purpose of this paper is to identify, operationalise, and test a knowledge management model in the context of electroacoustic and mixed music preservation. This operationalisation intends to provide an interdisciplinary framework for the specification of meaningful usability for idiosyncratic technological artefacts build up during the creative process of these works. Design/methodology/approach – The design of the questionnaire was based on semi-structured interviews with seven composers. The resulting questionnaire was used for an online survey targeting composers registered at electroacoustic and mixed music online associations. Data were collected from 33 composers. Findings – This article demonstrates the relevance of Boisot’s knowledge management model in order to categorize the knowledge involved during the creative process of electroacoustic and mixed music with spatialisation. Research limitations/implications – In terms of Boisot’s model operationalisation, the authors identified limitations with regards to composers’ ability to discriminate between different levels of abstraction and diffusion. Since multiple agents, both human and non-human, are involved in the creative process of electroacoustic and mixed music, further studies should address their interaction throughout the creative process. Originality/value – Based on the findings of the survey, the authors propose the concept of significant knowledge as an extension of significant properties in order to provide a meaningful usability of digital objects. 
Since similar technologies are used in theatre, dance, and fine arts, the authors expect this research to benefit the artistic community at large in terms of preservation.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="23566267"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="23566267"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 23566267; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=23566267]").text(description); $(".js-view-count[data-work-id=23566267]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 23566267; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='23566267']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (false){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "-1" } } $('.js-work-strip[data-work-id=23566267]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":23566267,"title":"Archiving electroacoustic and mixed music: Significant knowledge involved in the creative process of works with spatialisation","internal_url":"https://www.academia.edu/23566267/Archiving_electroacoustic_and_mixed_music_Significant_knowledge_involved_in_the_creative_process_of_works_with_spatialisation","owner_id":45597668,"coauthors_can_edit":true,"owner":{"id":45597668,"first_name":"Catherine","middle_initials":null,"last_name":"Guastavino","page_name":"CatherineGuastavino","domain_name":"mcgill","created_at":"2016-03-22T07:12:44.788-07:00","display_name":"Catherine Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="13652496"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" rel="nofollow" 
Upper limits of auditory rotational motion perception
by Catherine Guastavino, Ilja Frissen, and François-xavier Féron
Journal of the Acoustical Society of America, 2010
Abstract: Three experiments are reported, which investigated the auditory velocity thresholds beyond which listeners are no longer able to perceptually resolve a smooth circular trajectory. These thresholds were measured for band-limited noises, white noise, and harmonic sounds (HS), and in different acoustical environments. Experiments 1 and 2 were conducted in an acoustically dry laboratory. Observed thresholds varied as a function of stimulus type and spectral content. Thresholds for band-limited noises were unaffected by center frequency and equal to that of white noise. For HS, however, thresholds decreased as the fundamental frequency of the stimulus increased. The third experiment was a replication of the second in a reverberant concert hall, which produced qualitatively similar results except that thresholds were significantly higher than in the acoustically dry laboratory.
The impact of technological advances on recording studio practices
Journal of the American Society for Information Science and Technology, 2013
Abstract: Since the invention of sound reproduction in the late 19th century, studio practices in musical recording evolved in parallel with technological improvements. Recently, digital technology and Internet file sharing led to the delocalization of professional recording studios and the decline of traditional record companies. A direct consequence of this new paradigm is that studio professions found themselves in a transitional phase, needing to be reinvented. To understand the scope of these recent technological advances, we first offer an overview of musical recording culture and history and show how studio recordings became a sophisticated form of musical artwork that differed from concert performances. We then trace the economic evolution of the recording industry through technological advances and present positive and negative impacts of the decline of the traditional business model on studio practices and professions. Finally, we report findings from interviews with six world-renowned record producers reflecting on their recording approaches, the impact of recent technological advances on their careers, and the future of their profession. Interviewees appreciate working on a wider variety of projects than they have in the past, but they all discuss trade-offs between artistic expectations and budget constraints in the current paradigm. Our investigations converge to show that studio professionals have adjusted their working settings to the new economic situation, although they still rely on the same aesthetic approaches as in the traditional business model to produce musical recordings.
Comparison of Methods for Collecting and Modeling Dissimilarity Data: Applications to Complex Sound Stimuli
Multivariate Behavioral Research, 2011
Abstract: Sorting procedures are frequently adopted as an alternative to dissimilarity ratings to measure the dissimilarity of large sets of stimuli in a comparatively short time. However, systematic empirical research on the consequences of this experiment-design choice is lacking. We carried out a behavioral experiment to assess the extent to which sorting procedures compare to dissimilarity ratings in terms of efficiency, …
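For readers unfamiliar with sorting data, the sketch below illustrates one common way free-sorting results are converted into a dissimilarity matrix before modeling (for example with multidimensional scaling): two stimuli placed in different groups by a participant count as dissimilar, and counts are averaged over participants. This is a generic illustration with hypothetical data, not the procedure reported in the paper; the function name sorting_to_dissimilarity and the stimulus labels are invented for the example.

    # Minimal sketch (not the authors' code): deriving a dissimilarity matrix
    # from free-sorting data. Each participant's sort is a dict: stimulus -> group label.
    from itertools import combinations

    import numpy as np

    def sorting_to_dissimilarity(sorts, stimuli):
        """Average co-occurrence dissimilarity: a pair sorted into different
        groups counts 1, a pair sorted together counts 0, averaged over participants."""
        n = len(stimuli)
        index = {s: k for k, s in enumerate(stimuli)}
        d = np.zeros((n, n))
        for sort in sorts:
            for a, b in combinations(stimuli, 2):
                if sort[a] != sort[b]:
                    d[index[a], index[b]] += 1
                    d[index[b], index[a]] += 1
        return d / len(sorts)

    # Hypothetical data: three participants sorting four sound stimuli
    sorts = [
        {"s1": "A", "s2": "A", "s3": "B", "s4": "B"},
        {"s1": "A", "s2": "B", "s3": "B", "s4": "B"},
        {"s1": "A", "s2": "A", "s3": "A", "s4": "B"},
    ]
    print(sorting_to_dissimilarity(sorts, ["s1", "s2", "s3", "s4"]))

The resulting matrix can then be submitted to the same multidimensional-scaling or clustering models as averaged dissimilarity ratings, which is what makes the two data-collection methods directly comparable.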
Perceptual Evaluation of Rolling Sound Synthesis
Acta Acustica united with Acustica, 2011
Abstract: Three listening tests were conducted to perceptually evaluate different versions of a new real-time synthesis approach for sounds of sustained contact interactions. This study aims to identify the most effective algorithm to create a realistic sound for rolling objects. In Experiments 1 and 2, participants were asked to rate the extent to which 6 different versions sounded like rolling sounds. Subsequently, in Experiment 3, participants compared the 6 versions best rated in Experiments 1 and 2 to the original recordings. Results are presented in terms of both statistical analysis of the most effective synthesis algorithm and qualitative user comments. On methodological grounds, the comparison of Experiments 1, 2, and 3 highlights major differences between judgments collected in reference to the original recordings as opposed to judgments based on memory representations of rolling sounds. This study was part of a larger collaborative research project entitled "Haptics, Sound and Interaction in the Design of Enactive Interfaces", concerning sound synthesis and interface design, haptic devices, perception and cognition.
Ecological validity of soundscape reproduction
by Catherine Guastavino and Brian F G Katz
Abstract: We introduce a methodology based on linguistic exploration of verbal data to investigate the influence of reproduction method on cognitive processing of environmental sounds in laboratory conditions. Three experiments were carried out to explore the ecological validity of reproduction systems. The reference study consisted of interviews conducted in actual environments, which were also recorded simultaneously. The recordings were used for two listening tests, the first one using stereophonic reproduction and the second one using multichannel reproduction. The comparison of the verbal data collected in the different contexts sketches some theoretical and methodological issues concerning the reproduction of everyday life scenes in laboratory conditions. The linguistic analyses indicate that the "same" acoustic phenomenon gives rise to different cognitive representations, depending on the spatial presentation of the stimuli. It follows that the quality of the reproduction system must be adapted to specific properties of mental representations (here, spatial immersion vs. source identification). On methodological grounds, the analysis of spontaneous language representations gives access to cognitive representations elaborated in real life situations and in experimental conditions. The comparison of the linguistic exploration can then be used as a psycholinguistic measure of the ecological validity of experimental settings.
Measuring Similarity between Flamenco Rhythmic Patterns
by Fabrice Marandola and Catherine Guastavino
Journal of New Music Research, 2009
Abstract: Music similarity underlies a large part of a listener's experience, as it relates to familiarity and associations between different pieces or parts. Rhythmic similarity has received scant research attention in comparison with other aspects of music similarity such as melody or harmony. Mathematical measures of rhythmic similarity have been proposed, but none of them has been compared to human judgments. We present a first study consisting of two listening tests conducted to compare two mathematical similarity measures, the chronotonic distance and the directed swap distance, to perceptual measures of similarity. In order to investigate the effect of expertise on the perception of rhythmic similarity, we contrasted three groups of participants, namely non-musicians, classically trained percussionists and flamenco musicians. Results are presented in terms of statistical analysis of the raw ratings, phylogenetic analysis of the dissimilarity matrices, correlation with mathematical measures and qualitative analysis of spontaneous verbal descriptions reported by participants. A main effect of expertise was observed on the raw ratings, but not on the dissimilarity matrices. No effect of tempo was observed. Results of both listening tests converge to show that the directed swap distance best matches human judgments of similarity regardless of expertise. The analysis of verbal descriptions indicates that novice listeners focused on 'surface' features, while musicians focused on the underlying rhythmic structure and used more specialized vocabulary.
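The two mathematical measures named in this abstract can be stated compactly. The sketch below is a minimal Python illustration under simplifying assumptions: rhythms are cyclic binary onset patterns of equal length, the swap distance assumes equal numbers of onsets (the directed swap distance relaxes this), and the chronotonic representation labels each pulse with the duration of the inter-onset interval it falls in. The example patterns are invented for illustration and are not transcriptions from the study.

    # Minimal sketch of two rhythm-distance measures on cyclic binary onset patterns
    # of equal length (e.g. 12-pulse patterns). Not the authors' implementation.
    import numpy as np

    def onset_positions(pattern):
        return [i for i, x in enumerate(pattern) if x]

    def swap_distance(a, b):
        """Swap distance for patterns with equal onset counts: total number of
        adjacent-pulse swaps needed to align the onsets (sketch assumption)."""
        pa, pb = onset_positions(a), onset_positions(b)
        assert len(pa) == len(pb), "sketch assumes equal numbers of onsets"
        return sum(abs(i - j) for i, j in zip(pa, pb))

    def chronotonic(pattern):
        """Chronotonic step function: each pulse carries the duration of the
        inter-onset interval it belongs to, computed cyclically."""
        n = len(pattern)
        onsets = onset_positions(pattern)
        out = np.zeros(n)
        for k, start in enumerate(onsets):
            end = onsets[(k + 1) % len(onsets)]
            ioi = (end - start) % n or n
            for t in range(ioi):
                out[(start + t) % n] = ioi
        return out

    def chronotonic_distance(a, b):
        """Area between the two chronotonic step functions (L1 difference)."""
        return float(np.abs(chronotonic(a) - chronotonic(b)).sum())

    # Illustrative 12-pulse patterns with five onsets each (hypothetical, not actual compas)
    pattern_a = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0]
    pattern_b = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 1]
    print(swap_distance(pattern_a, pattern_b), chronotonic_distance(pattern_a, pattern_b))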
Following gesture following: Grounding the documentation of a multi-agent music creation process
Abstract: The documentation of electroacoustic and mixed musical works typically relies on a posteriori data collection. In this article, we argue that the preservation of musical works having technological components should be grounded in a thorough documentation of the creative process that accounts for both human and non-human agents of creation. The present research aims at providing a ground for documentation policies that account for the creative process and provide relevant information for performance, migration, and analysis. To do so, we analyzed secondary ethnographic data from a two-year creation and production process of a musical work with a focus on gesture following. Using grounded theory, we developed a conceptual framework with different levels of abstraction and consequent levels of transferability to other creative contexts. Finally, we propose several paths for grounding a subsequent documentation framework in this conceptual framework.
Upper limits of auditory motion perception: the case of rotating sounds
by Catherine Guastavino, Ilja Frissen, and François-xavier Féron
Abstract: We report two experiments investigating rotating sounds presented on a circular array of 12 speakers. Velocity thresholds were measured for three different types of stimuli (broadband noises, white noise, harmonic sounds). In the first experiment, we gradually increased or decreased the velocity and asked participants to indicate the point at which they stopped or started (respectively) perceiving a rotating sound. The thresholds ranged between 1.95 and 2.80 rot/s for noises and between 1.65 and 2.75 rot/s for harmonic sounds. We observed significant effects of the direction of velocity change (acceleration or deceleration), stimulus type and fundamental frequency for harmonic sounds, but no effect of centre frequency was observed for broadband noises. In the second experiment, stimuli were presented at constant velocities in a single-interval forced-choice paradigm: listeners were asked to indicate whether the sound was rotating or not. The thresholds obtained were within the range of those of the first experiment. The effect of frequency for harmonic sounds was confirmed.
Frissen","url":"https://independent.academia.edu/IljaFrissen"},"attachments":[{"id":45104308,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/45104308/thumbnails/1.jpg","file_name":"UPPER_LIMITS_OF_AUDITORY_MOTION_PERCEPTI20160426-10373-1xpjm7w.pdf","download_url":"https://www.academia.edu/attachments/45104308/download_file","bulk_download_file_name":"Upper_limits_of_auditory_motion_percepti.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/45104308/UPPER_LIMITS_OF_AUDITORY_MOTION_PERCEPTI20160426-10373-1xpjm7w-libre.pdf?1461685985=\u0026response-content-disposition=attachment%3B+filename%3DUpper_limits_of_auditory_motion_percepti.pdf\u0026Expires=1741324680\u0026Signature=bM8lH5qqzebl9g2xJJzvMuOOwZkE0y1jN795aknnFBCHIUtUtoDDoMDjPz4zPSDyWWzaELBipTASMM2PSY4kuI1gVLfvEGrElt7ZnLTC0FG8pNQLJkSCDvRLMl5~iQJL4UAk9OPbuO6huIS9KPy8NE5fAxvnaivEw~At-vL7ZFU8vQ8hdmC2p-zODFGU8uT5~~-MiMOFeVXfG4kKCPQ64M2ylNbDMn2XLq2R8uiBKidhTskPobM6mWf2dHeIFdAi-1CLpBFU59jtZQ5rv9NU8nBrNfZxZNbY0nycVUeex2b9FzklZnrQNa6XeR4rh2wG6NqQfUuEkp7XXvi18W6cxQ__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> <div class="js-work-strip profile--work_container" data-work-id="23566292"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/23566292/Spectral_and_Spatial_Multichannel_Analysis_Synthesis_of_Interior_Aircraft_Sounds"><img alt="Research paper thumbnail of Spectral and Spatial Multichannel Analysis/Synthesis of Interior Aircraft Sounds" class="work-thumbnail" src="https://attachments.academia-assets.com/43991451/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/23566292/Spectral_and_Spatial_Multichannel_Analysis_Synthesis_of_Interior_Aircraft_Sounds">Spectral and Spatial Multichannel Analysis/Synthesis of Interior Aircraft Sounds</a></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">A method for spectral and spatial multichannel analysis/synthesis of interior aircraft sounds is ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">A method for spectral and spatial multichannel analysis/synthesis of interior aircraft sounds is presented. We propose two extensions of the classical sinusoids+noise model, adapted to multichannel stationary sounds. First, a spectral estimator is described, using average information across channels for spectral peak detection. Second, the residual modeling is extended to integrate two interchannel spatial cues (i.e., coherence and phase difference). This approach allows real-time synthesis and control of sounds spectral and spatial characteristics. It finds applications for multichannel aircraft sound reproduction, and more generally for musical and environmental sound synthesis. 
The ability of the model to reproduce multichannel aircraft sounds is assessed by a numerical simulation.
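
The two extensions described in this abstract lend themselves to a compact illustration. The sketch below is not the authors' implementation; it is a minimal numpy-based example, with illustrative names and parameters, of (1) picking sinusoidal peaks on a magnitude spectrum averaged across channels and (2) summarising a two-channel residual with the two interchannel cues named above, coherence and phase difference.

import numpy as np

def average_spectrum_peaks(frames, n_peaks=20):
    """frames: (n_channels, n_samples) block of a stationary multichannel sound."""
    spectra = np.abs(np.fft.rfft(frames, axis=1))   # per-channel magnitude spectra
    mean_spec = spectra.mean(axis=0)                # average information across channels
    # crude local-maximum peak picking on the averaged spectrum
    is_peak = (mean_spec[1:-1] > mean_spec[:-2]) & (mean_spec[1:-1] > mean_spec[2:])
    bins = np.flatnonzero(is_peak) + 1
    return bins[np.argsort(mean_spec[bins])[::-1][:n_peaks]]   # strongest peaks first

def interchannel_cues(x, y, n_fft=1024, hop=512):
    """Coherence and mean phase difference between two residual channels, per bin."""
    win = np.hanning(n_fft)
    n_frames = (len(x) - n_fft) // hop + 1
    Sxx = Syy = Sxy = 0.0
    for k in range(n_frames):
        X = np.fft.rfft(win * x[k * hop:k * hop + n_fft])
        Y = np.fft.rfft(win * y[k * hop:k * hop + n_fft])
        Sxx = Sxx + np.abs(X) ** 2
        Syy = Syy + np.abs(Y) ** 2
        Sxy = Sxy + X * np.conj(Y)
    coherence = np.abs(Sxy) ** 2 / (Sxx * Syy + 1e-12)   # 0 (decorrelated) .. 1 (coherent)
    phase_diff = np.angle(Sxy)                            # radians, per frequency bin
    return coherence, phase_diff

Feeding the second function two independent noise channels gives coherence near zero, while feeding it the same channel twice with a short delay gives coherence near one and a phase difference that grows roughly linearly with frequency, which is the kind of spatial information the extended residual model carries.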

Identifying factors of bicycle comfort: An online survey with enthusiast cyclists
Applied Ergonomics
https://www.academia.edu/23566313/Identifying_factors_of_bicycle_comfort_An_online_survey_with_enthusiast_cyclists
Racing bicycles have evolved significantly over the past decades as technology and cyclists' comfort have become a critical design issue. Although ample research has been conducted on comfort for other means of transportation, cyclists' perception of dynamic comfort has received scant attention in the scientific literature. The present study investigates how enthusiast cyclists conceptualize comfort using an online survey with 244 respondents. The purpose is to determine which factors contribute to comfort when riding a bicycle, to identify situations in which comfort is relevant and to determine the extent to which vibrations play a role in comfort evaluations.
We found that comfort is influenced by factors related to bicycle components (specifically the frame, saddle and handlebar), as well as environmental factors (type of road, weather conditions) and factors related to the cyclist (position, adjustments, body parts). Respondents indicated that comfort is a concern when ...

A Digital Archives Framework for the Preservation of Artistic Works with Technological Components
https://www.academia.edu/23566318/A_Digital_Archives_Framework_for_the_Preservation_of_Artistic_Works_with_Technological_Components
The preservation of artistic works with technological components, such as musical works, is recognised as an issue by both the artistic community and the archival community.
Preserving such works involves tackling the difficulties associated with digital information in general, but also raises its own specific problems, such as constantly evolving digital instruments embodied within software and idiosyncratic human-computer interactions. Because of these issues, standards in place for archiving digital information are not always suitable for the preservation of these works. The impact on the organisation and the descriptions of such archives needs to be conceptualised in order to provide these technological components with readability, authenticity and intelligibility. While previous projects emphasized readability and authenticity, less effort has been dedicated to addressing intelligibility issues. The research into the specification of significant properties and its extension, nam...

Following Gesture Following: Grounding the Documentation of a Multi-Agent Creation Process
Computer Music Journal
https://www.academia.edu/23566320/Following_Gesture_Following_Grounding_the_Documentation_of_a_Multi_Agent_Creation_Process
The documentation of electroacoustic and mixed musical works typically relies on a posteriori data collection. In this article, we argue that the preservation of musical works having technological components should be grounded in a thorough documentation of the creative process that accounts for both human and nonhuman agents of creation. The present research aims at providing a ground for documentation policies that account for the creative process and provide relevant information for performance, migration, and analysis. To do so, we analyzed secondary ethnographic data from a two-year creation and production process of a musical work having a focus on gesture following. Using grounded theory, we developed a conceptual framework with different levels of abstraction and consequent levels of transferability to other creative contexts.
Finally, we propose several paths for grounding a subsequent documentation framework in this conceptual framework.

Subject Explorer 3D: A Virtual Reality Collection Browsing and Searching Tool
Proceedings of the 38th …, 2010
https://www.academia.edu/23624826/Subject_Explorer_3D_A_Virtual_Reality_Collection_Browsing_and_Searching_Tool

The ideal urban soundscape: Investigating the sound quality of French cities
Acta Acustica united with Acustica, 2006
https://www.academia.edu/23624827/The_ideal_urban_soundscape_Investigating_the_sound_quality_of_French_cities
Catherine Guastavino, McGill University, Graduate School of Library and Information Studies, Centre for Interdisciplinary Research on Music Media and Technology, 3459 McTavish, Montreal, H3A 1Y1, Canada.
Catherine.Guastavino@mcgill.ca

Categorization of environmental sounds
Canadian Journal of Experimental Psychology/Revue canadienne de psychologie expérimentale, 2007
https://www.academia.edu/23624828/Categorization_of_environmental_sounds

Do whole-body vibrations affect spatial hearing?
by Catherine Guastavino and Ilja Frissen
Ergonomics, 2014
https://www.academia.edu/13652471/Do_whole_body_vibrations_affect_spatial_hearing
To assist the human operator, modern auditory interfaces increasingly rely on sound spatialisation to display auditory information and warning signals. However, we often operate in environments that apply vibrations to the whole body, e.g. when driving a vehicle. Here, we report three experiments investigating the effect of sinusoidal vibrations along the vertical axis on spatial hearing. The first was a free-field, narrow-band noise localisation experiment with 5-Hz vibration at 0.88 m/s². The other experiments used headphone-based sound lateralisation tasks. Experiment 2 investigated the effect of vibration frequency (4 vs. 8 Hz) at two different magnitudes (0.83 vs. 1.65 m/s²) on a left-right discrimination one-interval forced-choice task. Experiment 3 assessed the effect on a two-interval forced-choice location discrimination task with respect to the central and two peripheral reference locations. In spite of the broad range of methods, none of the experiments shows a reliable effect of whole-body vibrations on localisation performance. We report three experiments that used both free-field localisation and headphone lateralisation tasks to assess their sensitivity to whole-body vibrations at low frequencies.
None of the experiments shows a reliable effect of either frequency or magnitude of whole-body vibrations on localisation performance.

Perceptual evaluation of multi-dimensional spatial audio reproduction
by Brian F G Katz and Catherine Guastavino
The Journal of the Acoustical Society of America, 2004
https://www.academia.edu/19167588/Perceptual_evaluation_of_multi_dimensional_spatial_audio_reproduction
Perceptual differences between sound reproduction systems with multiple spatial dimensions have been investigated. Two blind studies were performed using system configurations involving 1-D, 2-D, and 3-D loudspeaker arrays. Various types of source material were used, ranging from urban soundscapes to musical passages. Experiment I consisted in collecting subjects' perceptions in a free-response format to identify relevant criteria for multi-dimensional spatial sound reproduction of complex auditory scenes by means of linguistic analysis. Experiment II utilized both free response and scale judgments for seven parameters derived from Experiment I.
Results indicated a strong correlation between the source material (sound scene) and the subjective evaluation of the parameters, making the notion of an "optimal" reproduction method difficult for arbitrary source material.

Constructing a true LCSH tree of a science and engineering collection
Journal of the American Society for Information Science and Technology, 2012
https://www.academia.edu/23624831/Constructing_a_true_LCSH_tree_of_a_science_and_engineering_collection
The Library of Congress Subject Headings (LCSH) is a subject structure used to index large library collections throughout the world. Browsing a collection through LCSH is difficult using current online tools in part because users cannot explore the structure using their existing experience navigating file hierarchies on their hard drives. This is due to inconsistencies in the LCSH structure, which does not adhere to the specific rules defining tree structures. This article proposes a method to adapt the LCSH structure to reflect a real-world collection from the domain of science and engineering. This structure is transformed into a valid tree structure using an automatic process. The analysis of the resulting LCSH tree shows a large and complex structure.
The analysis of the distribution of information within the LCSH tree reveals a power law distribution where the vast majority of subjects contain few information items and a few subjects contain the vast majority of the collection.
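
As a rough sketch of how such an automatic transformation can work (an assumed approach for illustration, not necessarily the article's algorithm), a breadth-first traversal of the broader/narrower graph from a chosen root can keep only the first edge that reaches each heading, so every heading receives exactly one parent and cross-links or cycles are dropped:

from collections import deque

def graph_to_tree(narrower, root):
    """narrower: dict mapping a heading to its list of narrower headings.
    Returns a parent map that encodes a valid tree rooted at `root`."""
    parent = {root: None}
    queue = deque([root])
    while queue:
        head = queue.popleft()
        for child in narrower.get(head, []):
            if child not in parent:      # first visit wins: exactly one parent per heading
                parent[child] = head
                queue.append(child)      # later edges (cross-links, back-edges) are ignored
    return parent

# toy example: "Engineering" has two broader terms; the resulting tree keeps only one
subjects = {"Root": ["Science", "Technology"],
            "Science": ["Physics", "Engineering"],
            "Technology": ["Engineering"],
            "Physics": ["Acoustics"]}
tree = graph_to_tree(subjects, "Root")   # Engineering's single parent becomes Science

Counting the catalogue items attached under each node of such a tree is then enough to observe the highly skewed, power-law-like distribution the abstract reports.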
Guastavino","url":"https://mcgill.academia.edu/CatherineGuastavino"},"attachments":[{"id":44038493,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/44038493/thumbnails/1.jpg","file_name":"Constructing_a_true_LCSH_tree_of_a_scien20160323-14036-14fmy9w.pdf","download_url":"https://www.academia.edu/attachments/44038493/download_file","bulk_download_file_name":"Constructing_a_true_LCSH_tree_of_a_scien.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/44038493/Constructing_a_true_LCSH_tree_of_a_scien20160323-14036-14fmy9w-libre.pdf?1458758184=\u0026response-content-disposition=attachment%3B+filename%3DConstructing_a_true_LCSH_tree_of_a_scien.pdf\u0026Expires=1741338292\u0026Signature=UF3s9x~bMBryBcNZu7EQmY7PMywtHLgNcC1eXgQ9oPCox96ktipUv-mwtWUMIBsuKLGDq9OfSy8PeQ-LDTuPhA9gwKHaehcOuEZAuKa4kWd8JPDaoHageCRXWgo8CkzgKWQpsfmtBhU2WHyW0sAxwhZka6ZtJz5msxVAtbzj2LppUCr~qF5MWya~6ms-42hQBhAHD2HJ1QyH8S2b~Ilqi8otsBlIhK9s9~if8eNgdwlgKwzKkASB3cs6ZID-Mpd66ohnw6kTBoBhByKINGrMKoXg2qyizyZtROihCYC7qttXidSw2w03fEeqynC~aURH7Tw96lCZgJkY-W4cQpitWA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> </div><div class="profile--tab_content_container js-tab-pane tab-pane" data-section-id="5494925" id="digitaleffects"><div class="js-work-strip profile--work_container" data-work-id="6665169"><div class="profile--work_thumbnail hidden-xs"><a class="js-work-strip-work-link" data-click-track="profile-work-strip-thumbnail" href="https://www.academia.edu/6665169/An_interdisciplinary_approach_to_audio_to_effect_classification"><img alt="Research paper thumbnail of An interdisciplinary approach to audio to effect classification" class="work-thumbnail" src="https://attachments.academia-assets.com/33396262/thumbnails/1.jpg" /></a></div><div class="wp-workCard wp-workCard_itemContainer"><div class="wp-workCard_item wp-workCard--title"><a class="js-work-strip-work-link text-gray-darker" data-click-track="profile-work-strip-title" href="https://www.academia.edu/6665169/An_interdisciplinary_approach_to_audio_to_effect_classification">An interdisciplinary approach to audio to effect classification</a></div><div class="wp-workCard_item wp-workCard--coauthors"><span>by </span><span><a class="" data-click-track="profile-work-strip-authors" href="https://umontreal.academia.edu/CarolineTraube">Caroline Traube</a> and <a class="" data-click-track="profile-work-strip-authors" href="https://mcgill.academia.edu/CatherineGuastavino">Catherine Guastavino</a></span></div><div class="wp-workCard_item"><span class="js-work-more-abstract-truncated">The aim of this paper is to propose an interdisciplinary classification of digital audio effects ...</span><a class="js-work-more-abstract" data-broccoli-component="work_strip.more_abstract" data-click-track="profile-work-strip-more-abstract" href="javascript:;"><span> more </span><span><i class="fa fa-caret-down"></i></span></a><span class="js-work-more-abstract-untruncated hidden">The aim of this paper is to propose an interdisciplinary classification of digital audio effects to facilitate communication and collaborations between DSP programmers, sound engineers, composers, performers and musicologists. 
After reviewing classifications reflecting technological, technical and perceptual points of view, we introduce a transverse classification to link disciplinespecific classifications into a single network containing various layers of descriptors, ranging from low-level features to high-level features. Simple tools using the interdisciplinary classification are introduced to facilitate the navigation between effects, underlying techniques, perceptual attributes and semantic descriptors. Finally, concluding remarks on implications for teaching purposes and for the development of audio effects user interfaces based on perceptual features rather than technical parameters are presented.</span></div><div class="wp-workCard_item wp-workCard--actions"><span class="work-strip-bookmark-button-container"></span><a id="b2b11731f6b554ca7e2e35af74a1103b" class="wp-workCard--action" rel="nofollow" data-click-track="profile-work-strip-download" data-download="{&quot;attachment_id&quot;:33396262,&quot;asset_id&quot;:6665169,&quot;asset_type&quot;:&quot;Work&quot;,&quot;button_location&quot;:&quot;profile&quot;}" href="https://www.academia.edu/attachments/33396262/download_file?s=profile"><span><i class="fa fa-arrow-down"></i></span><span>Download</span></a><span class="wp-workCard--action visible-if-viewed-by-owner inline-block" style="display: none;"><span class="js-profile-work-strip-edit-button-wrapper profile-work-strip-edit-button-wrapper" data-work-id="6665169"><a class="js-profile-work-strip-edit-button" tabindex="0"><span><i class="fa fa-pencil"></i></span><span>Edit</span></a></span></span></div><div class="wp-workCard_item wp-workCard--stats"><span><span><span class="js-view-count view-count u-mr2x" data-work-id="6665169"><i class="fa fa-spinner fa-spin"></i></span><script>$(function () { var workId = 6665169; window.Academia.workViewCountsFetcher.queue(workId, function (count) { var description = window.$h.commaizeInt(count) + " " + window.$h.pluralize(count, 'View'); $(".js-view-count[data-work-id=6665169]").text(description); $(".js-view-count[data-work-id=6665169]").attr('title', description).tooltip(); }); });</script></span></span><span><span class="percentile-widget hidden"><span class="u-mr2x work-percentile"></span></span><script>$(function () { var workId = 6665169; window.Academia.workPercentilesFetcher.queue(workId, function (percentileText) { var container = $(".js-work-strip[data-work-id='6665169']"); container.find('.work-percentile').text(percentileText.charAt(0).toUpperCase() + percentileText.slice(1)); container.find('.percentile-widget').show(); container.find('.percentile-widget').removeClass('hidden'); }); });</script></span></div><div id="work-strip-premium-row-container"></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/work_edit-ad038b8c047c1a8d4fa01b402d530ff93c45fee2137a149a4a5398bc8ad67560.js"], function() { // from javascript_helper.rb var dispatcherData = {} if (true){ window.WowProfile.dispatcher = window.WowProfile.dispatcher || _.clone(Backbone.Events); dispatcherData = { dispatcher: window.WowProfile.dispatcher, downloadLinkId: "b2b11731f6b554ca7e2e35af74a1103b" } } $('.js-work-strip[data-work-id=6665169]').each(function() { if (!$(this).data('initialized')) { new WowProfile.WorkStripView({ el: this, workJSON: {"id":6665169,"title":"An interdisciplinary approach to audio to effect 
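The transverse classification described above is, in effect, a multi-layer network: an effect node can be reached from the DSP technique that implements it, from the perceptual attributes it evokes, or from semantic descriptors. The following is a minimal sketch of such a structure with made-up node names and layer labels; it is not the paper's taxonomy, only an illustration of the kind of navigation it enables.

```python
from collections import defaultdict

class EffectNetwork:
    """Toy multi-layer network linking audio effects, signal-processing
    techniques, perceptual attributes and semantic descriptors. Layer names,
    node names and the representation are illustrative assumptions."""

    def __init__(self) -> None:
        self.layer: dict[str, str] = {}   # node -> layer name
        self.links = defaultdict(set)     # undirected adjacency between nodes

    def add(self, node: str, layer: str) -> None:
        self.layer[node] = layer

    def link(self, a: str, b: str) -> None:
        self.links[a].add(b)
        self.links[b].add(a)

    def related(self, node: str, layer: str) -> set[str]:
        """Neighbours of `node` that sit on the requested layer."""
        return {n for n in self.links[node] if self.layer.get(n) == layer}

# Navigating from an effect to a perceptual attribute (hypothetical entries):
net = EffectNetwork()
net.add("flanger", "effect")
net.add("time-varying comb filtering", "technique")
net.add("sweeping / whooshing", "perceptual attribute")
net.link("flanger", "time-varying comb filtering")
net.link("flanger", "sweeping / whooshing")
print(net.related("flanger", "perceptual attribute"))  # {'sweeping / whooshing'}
```

A structure of this kind is enough to support the navigation the paper mentions, for example moving from an effect to its perceptual attributes and from there to other effects that share them.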
classification","internal_url":"https://www.academia.edu/6665169/An_interdisciplinary_approach_to_audio_to_effect_classification","owner_id":157950,"coauthors_can_edit":true,"owner":{"id":157950,"first_name":"Caroline","middle_initials":"","last_name":"Traube","page_name":"CarolineTraube","domain_name":"umontreal","created_at":"2010-03-31T13:06:02.177-07:00","display_name":"Caroline Traube","url":"https://umontreal.academia.edu/CarolineTraube"},"attachments":[{"id":33396262,"title":"","file_type":"pdf","scribd_thumbnail_url":"https://attachments.academia-assets.com/33396262/thumbnails/1.jpg","file_name":"Verfaille_Guastavino_Traube_DAFx_2006.pdf","download_url":"https://www.academia.edu/attachments/33396262/download_file","bulk_download_file_name":"An_interdisciplinary_approach_to_audio_t.pdf","bulk_download_url":"https://d1wqtxts1xzle7.cloudfront.net/33396262/Verfaille_Guastavino_Traube_DAFx_2006-libre.pdf?1396705511=\u0026response-content-disposition=attachment%3B+filename%3DAn_interdisciplinary_approach_to_audio_t.pdf\u0026Expires=1741315048\u0026Signature=UXxx7lXNj4cTy8KvJchZ5mpajjq04ePR8HtSJcxDq-brXHxfgaWbZXFUhRQ7HAx4jh0tur-2Kx7cIJMfFE7BRrF39tC07JkkYjYPFHDIn~D5oO8-rqtrKFMuaWHZAQImLrFVV~j4sbG1DoQVpCX0gRurHbivR0A0WvgFwHg~TSni76aBgOB4lwknFajwrkLjyyQRFWk~Y~dMozJ41Amx42tDMNtB3c0Ce5H76YNcb9T1OwEuO4cCiL3z873WEcO33E5CHVC9IqfBu6MC5WLHCsq4IUKQCNzOizyZ5-ZkJZp~ikHuqlCkWhOAj2bk~naAHVN7QZEhZ6UajHIDc6jLmA__\u0026Key-Pair-Id=APKAJLOHF5GGSLRBV4ZA"}]}, dispatcherData: dispatcherData }); $(this).data('initialized', true); } }); $a.trackClickSource(".js-work-strip-work-link", "profile_work_strip") }); </script> </div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js","https://a.academia-assets.com/assets/google_contacts-0dfb882d836b94dbcb4a2d123d6933fc9533eda5be911641f20b4eb428429600.js"], function() { // from javascript_helper.rb $('.js-google-connect-button').click(function(e) { e.preventDefault(); GoogleContacts.authorize_and_show_contacts(); Aedu.Dismissibles.recordClickthrough("WowProfileImportContactsPrompt"); }); $('.js-update-biography-button').click(function(e) { e.preventDefault(); Aedu.Dismissibles.recordClickthrough("UpdateUserBiographyPrompt"); $.ajax({ url: $r.api_v0_profiles_update_about_path({ subdomain_param: 'api', about: "", }), type: 'PUT', success: function(response) { location.reload(); } }); }); $('.js-work-creator-button').click(function (e) { e.preventDefault(); window.location = $r.upload_funnel_document_path({ source: encodeURIComponent(""), }); }); $('.js-video-upload-button').click(function (e) { e.preventDefault(); window.location = $r.upload_funnel_video_path({ source: encodeURIComponent(""), }); }); $('.js-do-this-later-button').click(function() { $(this).closest('.js-profile-nag-panel').remove(); Aedu.Dismissibles.recordDismissal("WowProfileImportContactsPrompt"); }); $('.js-update-biography-do-this-later-button').click(function(){ $(this).closest('.js-profile-nag-panel').remove(); Aedu.Dismissibles.recordDismissal("UpdateUserBiographyPrompt"); }); $('.wow-profile-mentions-upsell--close').click(function(){ $('.wow-profile-mentions-upsell--panel').hide(); Aedu.Dismissibles.recordDismissal("WowProfileMentionsUpsell"); }); $('.wow-profile-mentions-upsell--button').click(function(){ Aedu.Dismissibles.recordClickthrough("WowProfileMentionsUpsell"); }); new WowProfile.SocialRedesignUserWorks({ initialWorksOffset: 20, allWorksOffset: 20, maxSections: 3 }) 
}); </script> </div></div></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/wow_profile_edit-5ea339ee107c863779f560dd7275595239fed73f1a13d279d2b599a28c0ecd33.js","https://a.academia-assets.com/assets/add_coauthor-22174b608f9cb871d03443cafa7feac496fb50d7df2d66a53f5ee3c04ba67f53.js","https://a.academia-assets.com/assets/tab-dcac0130902f0cc2d8cb403714dd47454f11fc6fb0e99ae6a0827b06613abc20.js","https://a.academia-assets.com/assets/wow_profile-a9bf3a2bc8c89fa2a77156577594264ee8a0f214d74241bc0fcd3f69f8d107ac.js"], function() { // from javascript_helper.rb window.ae = window.ae || {}; window.ae.WowProfile = window.ae.WowProfile || {}; if(Aedu.User.current && Aedu.User.current.id === $viewedUser.id) { window.ae.WowProfile.current_user_edit = {}; new WowProfileEdit.EditUploadView({ el: '.js-edit-upload-button-wrapper', model: window.$current_user, }); new AddCoauthor.AddCoauthorsController(); } var userInfoView = new WowProfile.SocialRedesignUserInfo({ recaptcha_key: "6LdxlRMTAAAAADnu_zyLhLg0YF9uACwz78shpjJB" }); WowProfile.router = new WowProfile.Router({ userInfoView: userInfoView }); Backbone.history.start({ pushState: true, root: "/" + $viewedUser.page_name }); new WowProfile.UserWorksNav() }); </script> </div> <div class="bootstrap login"><div class="modal fade login-modal" id="login-modal"><div class="login-modal-dialog modal-dialog"><div class="modal-content"><div class="modal-header"><button class="close close" data-dismiss="modal" type="button"><span aria-hidden="true">&times;</span><span class="sr-only">Close</span></button><h4 class="modal-title text-center"><strong>Log In</strong></h4></div><div class="modal-body"><div class="row"><div class="col-xs-10 col-xs-offset-1"><button class="btn btn-fb btn-lg btn-block btn-v-center-content" id="login-facebook-oauth-button"><svg style="float: left; width: 19px; line-height: 1em; margin-right: .3em;" aria-hidden="true" focusable="false" data-prefix="fab" data-icon="facebook-square" class="svg-inline--fa fa-facebook-square fa-w-14" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 448 512"><path fill="currentColor" d="M400 32H48A48 48 0 0 0 0 80v352a48 48 0 0 0 48 48h137.25V327.69h-63V256h63v-54.64c0-62.15 37-96.48 93.67-96.48 27.14 0 55.52 4.84 55.52 4.84v61h-31.27c-30.81 0-40.42 19.12-40.42 38.73V256h68.78l-11 71.69h-57.78V480H400a48 48 0 0 0 48-48V80a48 48 0 0 0-48-48z"></path></svg><small><strong>Log in</strong> with <strong>Facebook</strong></small></button><br /><button class="btn btn-google btn-lg btn-block btn-v-center-content" id="login-google-oauth-button"><svg style="float: left; width: 22px; line-height: 1em; margin-right: .3em;" aria-hidden="true" focusable="false" data-prefix="fab" data-icon="google-plus" class="svg-inline--fa fa-google-plus fa-w-16" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path fill="currentColor" d="M256,8C119.1,8,8,119.1,8,256S119.1,504,256,504,504,392.9,504,256,392.9,8,256,8ZM185.3,380a124,124,0,0,1,0-248c31.3,0,60.1,11,83,32.3l-33.6,32.6c-13.2-12.9-31.3-19.1-49.4-19.1-42.9,0-77.2,35.5-77.2,78.1S142.3,334,185.3,334c32.6,0,64.9-19.1,70.1-53.3H185.3V238.1H302.2a109.2,109.2,0,0,1,1.9,20.7c0,70.8-47.5,121.2-118.8,121.2ZM415.5,273.8v35.5H380V273.8H344.5V238.3H380V202.8h35.5v35.5h35.2v35.5Z"></path></svg><small><strong>Log in</strong> with <strong>Google</strong></small></button><br /><style type="text/css">.sign-in-with-apple-button { width: 100%; height: 52px; border-radius: 3px; border: 1px solid black; cursor: pointer; } 
.sign-in-with-apple-button > div { margin: 0 auto; / This centers the Apple-rendered button horizontally }</style><script src="https://appleid.cdn-apple.com/appleauth/static/jsapi/appleid/1/en_US/appleid.auth.js" type="text/javascript"></script><div class="sign-in-with-apple-button" data-border="false" data-color="white" id="appleid-signin"><span &nbsp;&nbsp;="Sign Up with Apple" class="u-fs11"></span></div><script>AppleID.auth.init({ clientId: 'edu.academia.applesignon', scope: 'name email', redirectURI: 'https://www.academia.edu/sessions', state: "567461b7dcca13cb7e0def6a176c32fc563df0ad5ec8f442b98d1e8536c6bf55", });</script><script>// Hacky way of checking if on fast loswp if (window.loswp == null) { (function() { const Google = window?.Aedu?.Auth?.OauthButton?.Login?.Google; const Facebook = window?.Aedu?.Auth?.OauthButton?.Login?.Facebook; if (Google) { new Google({ el: '#login-google-oauth-button', rememberMeCheckboxId: 'remember_me', track: null }); } if (Facebook) { new Facebook({ el: '#login-facebook-oauth-button', rememberMeCheckboxId: 'remember_me', track: null }); } })(); }</script></div></div></div><div class="modal-body"><div class="row"><div class="col-xs-10 col-xs-offset-1"><div class="hr-heading login-hr-heading"><span class="hr-heading-text">or</span></div></div></div></div><div class="modal-body"><div class="row"><div class="col-xs-10 col-xs-offset-1"><form class="js-login-form" action="https://www.academia.edu/sessions" accept-charset="UTF-8" method="post"><input type="hidden" name="authenticity_token" value="N4w3UqXn1hEvoTY7bwaY55oTOZXCe_4BNHmex4hICmU7sZwL6WfhDtPaQ757HnWrkEhfzvhV9vtdCghxK4qJWQ" autocomplete="off" /><div class="form-group"><label class="control-label" for="login-modal-email-input" style="font-size: 14px;">Email</label><input class="form-control" id="login-modal-email-input" name="login" type="email" /></div><div class="form-group"><label class="control-label" for="login-modal-password-input" style="font-size: 14px;">Password</label><input class="form-control" id="login-modal-password-input" name="password" type="password" /></div><input type="hidden" name="post_login_redirect_url" id="post_login_redirect_url" value="https://mcgill.academia.edu/CatherineGuastavino" autocomplete="off" /><div class="checkbox"><label><input type="checkbox" name="remember_me" id="remember_me" value="1" checked="checked" /><small style="font-size: 12px; margin-top: 2px; display: inline-block;">Remember me on this computer</small></label></div><br><input type="submit" name="commit" value="Log In" class="btn btn-primary btn-block btn-lg js-login-submit" data-disable-with="Log In" /></br></form><script>typeof window?.Aedu?.recaptchaManagedForm === 'function' && window.Aedu.recaptchaManagedForm( document.querySelector('.js-login-form'), document.querySelector('.js-login-submit') );</script><small style="font-size: 12px;"><br />or <a data-target="#login-modal-reset-password-container" data-toggle="collapse" href="javascript:void(0)">reset password</a></small><div class="collapse" id="login-modal-reset-password-container"><br /><div class="well margin-0x"><form class="js-password-reset-form" action="https://www.academia.edu/reset_password" accept-charset="UTF-8" method="post"><input type="hidden" name="authenticity_token" value="7kfADu7xoObzOJUdWRdWc1qrqVRQj02H1UFjq5nCX1DiemtXonGX-Q9D4JhND7s_UPDPD2qhRX28MvUdOgDcbA" autocomplete="off" /><p>Enter the email address you signed up with and we&#39;ll email you a reset link.</p><div class="form-group"><input class="form-control" 
name="email" type="email" /></div><script src="https://recaptcha.net/recaptcha/api.js" async defer></script> <script> var invisibleRecaptchaSubmit = function () { var closestForm = function (ele) { var curEle = ele.parentNode; while (curEle.nodeName !== 'FORM' && curEle.nodeName !== 'BODY'){ curEle = curEle.parentNode; } return curEle.nodeName === 'FORM' ? curEle : null }; var eles = document.getElementsByClassName('g-recaptcha'); if (eles.length > 0) { var form = closestForm(eles[0]); if (form) { form.submit(); } } }; </script> <input type="submit" data-sitekey="6Lf3KHUUAAAAACggoMpmGJdQDtiyrjVlvGJ6BbAj" data-callback="invisibleRecaptchaSubmit" class="g-recaptcha btn btn-primary btn-block" value="Email me a link" value=""/> </form></div></div><script> require.config({ waitSeconds: 90 })(["https://a.academia-assets.com/assets/collapse-45805421cf446ca5adf7aaa1935b08a3a8d1d9a6cc5d91a62a2a3a00b20b3e6a.js"], function() { // from javascript_helper.rb $("#login-modal-reset-password-container").on("shown.bs.collapse", function() { $(this).find("input[type=email]").focus(); }); }); </script> </div></div></div><div class="modal-footer"><div class="text-center"><small style="font-size: 12px;">Need an account?&nbsp;<a rel="nofollow" href="https://www.academia.edu/signup">Click here to sign up</a></small></div></div></div></div></div></div><script>// If we are on subdomain or non-bootstrapped page, redirect to login page instead of showing modal (function(){ if (typeof $ === 'undefined') return; var host = window.location.hostname; if ((host === $domain || host === "www."+$domain) && (typeof $().modal === 'function')) { $("#nav_log_in").click(function(e) { // Don't follow the link and open the modal e.preventDefault(); $("#login-modal").on('shown.bs.modal', function() { $(this).find("#login-modal-email-input").focus() }).modal('show'); }); } })()</script> <div class="bootstrap" id="footer"><div class="footer-content clearfix text-center padding-top-7x" style="width:100%;"><ul class="footer-links-secondary footer-links-wide list-inline margin-bottom-1x"><li><a href="https://www.academia.edu/about">About</a></li><li><a href="https://www.academia.edu/press">Press</a></li><li><a href="https://www.academia.edu/documents">Papers</a></li><li><a href="https://www.academia.edu/topics">Topics</a></li><li><a href="https://www.academia.edu/journals">Academia.edu Journals</a></li><li><a rel="nofollow" href="https://www.academia.edu/hiring"><svg style="width: 13px; height: 13px;" aria-hidden="true" focusable="false" data-prefix="fas" data-icon="briefcase" class="svg-inline--fa fa-briefcase fa-w-16" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path fill="currentColor" d="M320 336c0 8.84-7.16 16-16 16h-96c-8.84 0-16-7.16-16-16v-48H0v144c0 25.6 22.4 48 48 48h416c25.6 0 48-22.4 48-48V288H320v48zm144-208h-80V80c0-25.6-22.4-48-48-48H176c-25.6 0-48 22.4-48 48v48H48c-25.6 0-48 22.4-48 48v80h512v-80c0-25.6-22.4-48-48-48zm-144 0H192V96h128v32z"></path></svg>&nbsp;<strong>We're Hiring!</strong></a></li><li><a rel="nofollow" href="https://support.academia.edu/hc/en-us"><svg style="width: 12px; height: 12px;" aria-hidden="true" focusable="false" data-prefix="fas" data-icon="question-circle" class="svg-inline--fa fa-question-circle fa-w-16" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path fill="currentColor" d="M504 256c0 136.997-111.043 248-248 248S8 392.997 8 256C8 119.083 119.043 8 256 8s248 111.083 248 248zM262.655 90c-54.497 0-89.255 22.957-116.549 63.758-3.536 5.286-2.353 
12.415 2.715 16.258l34.699 26.31c5.205 3.947 12.621 3.008 16.665-2.122 17.864-22.658 30.113-35.797 57.303-35.797 20.429 0 45.698 13.148 45.698 32.958 0 14.976-12.363 22.667-32.534 33.976C247.128 238.528 216 254.941 216 296v4c0 6.627 5.373 12 12 12h56c6.627 0 12-5.373 12-12v-1.333c0-28.462 83.186-29.647 83.186-106.667 0-58.002-60.165-102-116.531-102zM256 338c-25.365 0-46 20.635-46 46 0 25.364 20.635 46 46 46s46-20.636 46-46c0-25.365-20.635-46-46-46z"></path></svg>&nbsp;<strong>Help Center</strong></a></li></ul><ul class="footer-links-tertiary list-inline margin-bottom-1x"><li class="small">Find new research papers in:</li><li class="small"><a href="https://www.academia.edu/Documents/in/Physics">Physics</a></li><li class="small"><a href="https://www.academia.edu/Documents/in/Chemistry">Chemistry</a></li><li class="small"><a href="https://www.academia.edu/Documents/in/Biology">Biology</a></li><li class="small"><a href="https://www.academia.edu/Documents/in/Health_Sciences">Health Sciences</a></li><li class="small"><a href="https://www.academia.edu/Documents/in/Ecology">Ecology</a></li><li class="small"><a href="https://www.academia.edu/Documents/in/Earth_Sciences">Earth Sciences</a></li><li class="small"><a href="https://www.academia.edu/Documents/in/Cognitive_Science">Cognitive Science</a></li><li class="small"><a href="https://www.academia.edu/Documents/in/Mathematics">Mathematics</a></li><li class="small"><a href="https://www.academia.edu/Documents/in/Computer_Science">Computer Science</a></li></ul></div></div><div class="DesignSystem" id="credit" style="width:100%;"><ul class="u-pl0x footer-links-legal list-inline"><li><a rel="nofollow" href="https://www.academia.edu/terms">Terms</a></li><li><a rel="nofollow" href="https://www.academia.edu/privacy">Privacy</a></li><li><a rel="nofollow" href="https://www.academia.edu/copyright">Copyright</a></li><li>Academia &copy;2025</li></ul></div><script> //<![CDATA[ window.detect_gmtoffset = true; window.Academia && window.Academia.set_gmtoffset && Academia.set_gmtoffset('/gmtoffset'); //]]> </script> <div id='overlay_background'></div> <div id='bootstrap-modal-container' class='bootstrap'></div> <div id='ds-modal-container' class='bootstrap DesignSystem'></div> <div id='full-screen-modal'></div> </div> </body> </html>

Pages: 1 2 3 4 5 6 7 8 9 10