Attributions of morality and mind to artificial intelligence after real-world moral violations

xmlns="http://www.w3.org/2000/svg" viewBox="0 0 320 512"><path fill="currentColor" d="M288.662 352H31.338c-17.818 0-26.741-21.543-14.142-34.142l128.662-128.662c7.81-7.81 20.474-7.81 28.284 0l128.662 128.662c12.6 12.599 3.676 34.142-14.142 34.142z"></path></svg></button></li></ul></li></ul></div> <script src="//a.academia-assets.com/assets/webpack_bundles/fast_loswp-bundle-e5ca05062a7092a8f6f5af11f70589210af26f6a0030f102b0b21b22451b9d41.js" defer="defer"></script><script>window.loswp = {}; window.loswp.author = 1952752; window.loswp.bulkDownloadFilterCounts = {}; window.loswp.hasDownloadableAttachment = true; window.loswp.hasViewableAttachments = true; // TODO: just use routes for this window.loswp.loginUrl = "https://www.academia.edu/login?post_login_redirect_url=https%3A%2F%2Fwww.academia.edu%2F110213023%2FAttributions_of_morality_and_mind_to_artificial_intelligence_after_real_world_moral_violations%3Fauto%3Ddownload"; window.loswp.translateUrl = "https://www.academia.edu/login?post_login_redirect_url=https%3A%2F%2Fwww.academia.edu%2F110213023%2FAttributions_of_morality_and_mind_to_artificial_intelligence_after_real_world_moral_violations%3Fshow_translation%3Dtrue"; window.loswp.previewableAttachments = [{"id":108099213,"identifier":"Attachment_108099213","shouldShowBulkDownload":false}]; window.loswp.shouldDetectTimezone = true; window.loswp.shouldShowBulkDownload = true; window.loswp.showSignupCaptcha = false window.loswp.willEdgeCache = false; window.loswp.work = {"work":{"id":110213023,"created_at":"2023-11-30T07:25:06.679-08:00","from_world_paper_id":244415119,"updated_at":"2024-11-25T03:04:48.945-08:00","_data":{"publisher":"Elsevier BV","grobid_abstract":"The media has portrayed certain artificial intelligence (AI) software as committing moral violations such as the AI judge of a human beauty contest being \"racist\" when it selected predominately light-skinned winners. We examine people's attributions of morality for seven such real-world events that were first publicized in the media, experimentally manipulating the occurrence of a violation and the inclusion of information about the AI's algorithm. Both the presence of the moral violation and the information about the AI's algorithm increase participant's reporting of a moral violation occurring in the event. However, even in the violation outcome conditions only 43.5 percent of the participants reported that they were sure that a moral violation occurred. Addressing whether the AI is blamed for the moral violation we found that people attributed increased wrongness to the AI-but not to the organization, programmer, or users-after a moral violation. In addition to moral wrongness, the AI was attributed moderate levels of awareness, intentionality, justification, and responsibility for the violation outcome. 
Attributions of morality and mind to artificial intelligence after real-world moral violations

Daniel B Shank
2018, Computers in Human Behavior (32 pages)

The media has portrayed certain artificial intelligence (AI) software as committing moral violations, such as the AI judge of a human beauty contest being "racist" when it selected predominately light-skinned winners. We examine people's attributions of morality for seven such real-world events that were first publicized in the media, experimentally manipulating the occurrence of a violation and the inclusion of information about the AI's algorithm. Both the presence of the moral violation and the information about the AI's algorithm increase participants' reporting of a moral violation occurring in the event. However, even in the violation outcome conditions, only 43.5 percent of the participants reported that they were sure that a moral violation occurred. Addressing whether the AI is blamed for the moral violation, we found that people attributed increased wrongness to the AI (but not to the organization, programmer, or users) after a moral violation. In addition to moral wrongness, the AI was attributed moderate levels of awareness, intentionality, justification, and responsibility for the violation outcome. Finally, the inclusion of the algorithm information marginally increased perceptions of the AI having mind, and perceived mind was positively related to attributions of intentionality and wrongness to the AI.

Related papers

When are artificial intelligence versus human agents faulted for wrongdoing? Moral attributions after individual and joint decisions
Daniel B Shank
Information, Communication & Society, 2019
Artificial intelligence (AI) agents make decisions that affect individuals and society which can produce outcomes traditionally considered moral violations if performed by humans.
Do people attribute the same moral permissibility and fault to AIs and humans when each produces the same moral violation outcome? Additionally, how do people attribute morality when the AI and human are jointly making the decision which produces that violation? We investigate these questions with an experiment that manipulates written descriptions of four real-world scenarios where, originally, a violation outcome was produced by an AI. Our decision-making structures include individual decision-making (either AIs or humans) and joint decision-making (either humans monitoring AIs or AIs recommending to humans). We find that the decision-making structure has little effect on morally faulting AIs, but that humans who monitor AIs are faulted less than solo humans and humans receiving recommendations. Furthermore, people attribute more permission and less fault to AIs compared to humans for the violation in both joint decision-making structures. The blame for joint AI-human wrongdoing suggests the potential for strategic scapegoating of AIs for human moral failings and the need for future research on AI-human teams.

How do People Judge the Immorality of Artificial Intelligence versus Humans Committing Moral Wrongs in Real-World Situations?
Daniel B Shank
2022
In general, people will judge a morally wrong behavior when perpetrated by an artificial intelligence (AI) as still being wrong. But moral judgements are complex, and therefore we explore how moral judgements made about AIs differ from those made about humans in real-world situations.
In contrast to much of the current research on the morality of AIs, we examine real-world encounters where an AI commits a moral wrong as reported by participants in previous research. We adapt these to create nearly identical scenarios with human perpetrators. In Study 1, across scenarios, humans are perceived as more wrong, intentional, and to blame compared to AIs. In Study 2, we replicate those results and find that showing the participants the contrasting scenario (showing the AI scenario when one is rating the human scenario, or vice versa) does not have a significant effect on moral judgements. An exploratory word-frequency analysis and illustrative quotes from participants' open-ended explanations...

Attributions toward Artificial Agents in a modified Moral Turing Test
Eyal Aharoni, Sharlene Fernandes
Scientific Reports
Advances in artificial intelligence (AI) raise important questions about whether people view moral evaluations by AI systems similarly to human-generated moral evaluations. We conducted a modified Moral Turing Test (m-MTT), inspired by Allen and colleagues' (2000) proposal, by asking people to distinguish real human moral evaluations from those made by a popular advanced AI language model: GPT-4. A representative sample of 299 U.S. adults first rated the quality of moral evaluations when blinded to their source.
Remarkably, they rated the AI's moral reasoning as superior in quality to humans' along almost all dimensions, including virtuousness, intelligence, and trustworthiness, consistent with passing what Allen and colleagues call the comparative MTT. Next, when tasked with identifying the source of each evaluation (human or computer), people performed significantly above chance levels. Although the AI did not pass this test, this was not because of its inferior moral reasoning but, potentially, its perceived superiority, among other possible explanations. The emergence of language models capable of producing moral responses perceived as superior in quality to humans' raises concerns that people may uncritically accept potentially harmful moral guidance from AI. This possibility highlights the need for safeguards around generative language models in matters of morality.

Artificial Artificial Intelligence: Measuring Influence of AI 'Assessments' on Moral Decision-Making
Jana Schaich Borg
2020
Given AI's growing role in modeling and improving decision-making, how and when to present users with feedback is an urgent topic to address. We empirically examined the effect of feedback from false AI on moral decision-making about donor kidney allocation. We found some evidence that judgments about whether a patient should receive a kidney can be influenced by feedback about participants' own decision-making perceived to be given by AI, even if the feedback is entirely random.
We also discovered different effects between assessments presented as being from human experts and assessments presented as being from AI.

Guilty Artificial Minds: Folk Attributions of Mens Rea and Culpability to Artificially Intelligent Agents
Michael T Stuart
Proceedings of the ACM, 2021
While philosophers hold that it is patently absurd to blame robots or hold them morally responsible, a series of recent empirical studies suggest that people do ascribe blame to AI systems and robots in certain contexts. This is disconcerting: Blame might be shifted from the owners, users or designers of AI systems to the systems themselves, leading to the diminished accountability of the responsible human agents. In this paper, we explore one of the potential underlying reasons for robot blame, namely the folk's willingness to ascribe inculpating mental states or "mens rea" to robots. In a vignette-based experiment (N=513), we presented participants with a situation in which an agent knowingly runs the risk of bringing about substantial harm. We manipulated agent type (human v. group agent v. AI-driven robot) and outcome (neutral v. bad), and measured both moral judgment (wrongness of the action and blameworthiness of the agent) and mental states attributed to the agent (recklessness and the desire to inflict harm). We found that (i) judgments of wrongness and blame were relatively similar across agent types, possibly because (ii) attributions of mental states were, as suspected, similar across agent types.
This raised the question, also explored in the experiment, of whether people attribute knowledge and desire to robots in a merely metaphorical way (e.g., the robot "knew" rather than really knew). However, (iii), according to our data people were unwilling to downgrade to mens rea in a merely metaphorical sense when given the chance. Finally, (iv), we report a surprising and novel finding, which we call the inverse outcome effect on robot blame: people were less willing to blame artificial agents for bad outcomes than for neutral outcomes. This suggests that they are implicitly aware of the dangers of overattributing blame to robots when harm comes to pass, such as inappropriately letting the responsible human agent off the moral hook.

Can Mind Perception Explain Virtuous Character Judgments of Artificial Intelligence
Daniel B Shank, Patrick Gamez
Technology, Mind, and Behavior, 2021
People tend to attribute less of a virtuous or unvirtuous characteristic to artificial intelligence (AI) agents compared to humans after observing a behavior exemplifying that particular virtue or vice. We argue that this difference can be explained by perceptions of experiential and agentic mind. Experiential mind focuses on one's emotions, sensations, and past experiences, whereas agentic mind focuses on one's intentions, capacity for action, and behaviors. Building on person-centered morality, virtue ethics, and mind perception research, we argue that both agentic and experiential mind are possible mediators of behavior-to-character attributions.
We conducted two experiments (n = 613, n = 584) using vignette scenarios in the virtue ethics domains of truth, justice, fear, wealth, and honor where we manipulated the actor to be an AI or human and the behavior to be virtuous or unvirtuous. As expected, we found that the character judgments of virtues and vices are weaker for AIs compared to humans. This character judgment difference is mediated by both experiential and agentic mind with a larger mediation effect for experiential mind compared to agentic mind. Exploratory analyses revealed differences in character and experiential mind based on the virtue domain.

Attributions of ethical responsibility by Artificial Intelligence practitioners
Jenny L Davis
Information, Communication & Society, 2020
Systems based on Artificial Intelligence (AI) are increasingly normalized as part of work, leisure, and governance in contemporary societies. Although ethics in AI has received significant attention, it remains unclear where the burden of responsibility lies. Through twenty-one interviews with AI practitioners in Australia, this research seeks to understand how ethical attributions figure into the professional imagination. As institutionally embedded technical experts, AI practitioners act as a connective tissue linking the range of actors that come in contact with, and have effects upon, AI products and services. Findings highlight that practitioners distribute ethical responsibility across a range of actors and factors, reserving a portion of responsibility for themselves, albeit constrained.
Characterized by imbalances of decision-making power and technical expertise, practitioners position themselves as mediators between powerful bodies that set parameters for production; users who engage with products once they leave the proverbial workbench; and AI systems that evolve and develop beyond practitioner control. Distributing responsibility throughout complex sociotechnical networks, practitioners preclude simple attributions of accountability for the social effects of AI. This indicates that AI ethics are not the purview of any singular player but instead derive from collectivities that require critical guidance and oversight at all stages of conception, production, distribution, and use.

Employing AI to Better Understand Our Morals
The Anh Han
Entropy
We present a summary of research that we have conducted employing AI to better understand human morality. This summary adumbrates theoretical fundamentals and considers how to regulate development of powerful new AI technologies. The latter research aim is benevolent AI, with fair distribution of benefits associated with the development of these and related technologies, avoiding disparities of power and wealth due to unregulated competition. Our approach avoids statistical models employed in other approaches to solve moral dilemmas, because these are "blind" to natural constraints on moral agents, and risk perpetuating mistakes. Instead, our approach employs, for instance, psychologically realistic counterfactual reasoning in group dynamics.
The present paper reviews studies involving factors fundamental to human moral motivation, including egoism vs. altruism, commitment vs. defaulting, guilt vs. non-guilt, apology plus forgiveness, counterfactual collaboration, among other factors...

AI in the headlines: the portrayal of the ethical issues of artificial intelligence in the media
Allen Coin
AI & SOCIETY, 2020
As artificial intelligence (AI) technologies become increasingly prominent in our daily lives, media coverage of the ethical considerations of these technologies has followed suit. Since previous research has shown that media coverage can drive public discourse about novel technologies, studying how the ethical issues of AI are portrayed in the media may lead to greater insight into the potential ramifications of this public discourse, particularly with regard to development and regulation of AI. This paper expands upon previous research by systematically analyzing and categorizing the media portrayal of the ethical issues of AI to better understand how media coverage of these issues may shape public debate about AI. Our results suggest that the media has a fairly realistic and practical focus in its coverage of the ethics of AI, but that the coverage is still shallow.
A multifaceted approach to handling the social, ethical and policy issues of AI technology is needed, including inc...

Appraisals of harms and injustice trigger an eerie feeling that decreases trust in artificial intelligence systems
Marc de Bourmont
Annals of Operations Research
As artificial intelligence (AI) becomes more pervasive, the concern over how users can trust artificial agents is more important than ever before. In this research, we seek to understand the trust formation between humans and artificial agents from the morality and uncanny theory perspective. We conducted three studies to carefully examine the effect of two moral foundations: perceptions of harm and perceptions of injustice, as well as reported wrongdoing on uncanniness and examine the effect of uncanniness on trust in artificial agents. In Study 1, we found perceived injustice was the primary determinant of uncanniness and uncanniness had a negative effect on trust. Studies 2 and 3 extended these findings using two different scenarios of wrongful acts involving an artificial agent. In addition to explaining the contribution of moral appraisals to the feeling of uncanny, the latter studies also uncover substantial contributions of both perceived harm and perceived injustice.
The results provide a foundation for establishing trust in artificial agents and designing an AI system by instilling moral values in it.

Artificial virtue: the machine question and perceptions of moral character in artificial moral agents
Patrick Gamez, Mallory North
AI & Society, 2020

Moral Thinking and Artificial Intelligence 2022
Morten Sestoft
Submitted to Danish Yearbook of Philosophy, 2022

Just an Artifact: Why Machines Are Perceived as Moral Agents
Joanna Bryson
2011

Attempts to Attribute Moral Agency to Intelligent Machines are Misguided
Roman Yampolskiy

Algorithmic Discrimination Causes Less Moral Outrage than Human Discrimination
Kurt Gray
2020

Moral judgments of human vs. robot agents
Bertram Malle
Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2016

Moral Psychology and Artificial Agents (Part Two)
Mika Koverola
Machine Law, Ethics, and Morality in the Age of Artificial Intelligence, 2021

Cognitive morality and AI: A proposed classification of AI systems using Kohlberg's theory of cognitive ethics
Shailendra Kumar

AI and moral thinking: how can we live well with machines to enhance our moral agency
Paula Boddington
AI Ethics, 2020

Exploring moral behavior of thinking machines
Abhi Agarwal

Machines, Humans and Critical Moral Thinking 2023
Morten Sestoft

Cognitive morality and artificial intelligence (AI): a proposed classification of AI systems using Kohlberg's theory of cognitive ethics
Sanghamitra Choudhury
Technological Sustainability, 2023
class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="13" data-entity-id="97739353" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/97739353/Moral_Action_Changes_Mind_Perception_for_Human_and_Artificial_Moral_Agents">Moral Action Changes Mind Perception for Human and Artificial Moral Agents</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="38013342" href="https://newbulgarian.academia.edu/MGrinberg">Maurice Grinberg</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Cognitive Science, 2017</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;Moral Action Changes Mind Perception for Human and Artificial Moral Agents&quot;,&quot;attachmentId&quot;:99279978,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/97739353/Moral_Action_Changes_Mind_Perception_for_Human_and_Artificial_Moral_Agents&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/97739353/Moral_Action_Changes_Mind_Perception_for_Human_and_Artificial_Moral_Agents"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="14" data-entity-id="100632238" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/100632238/AI_Moral_Enhancement_Upgrading_the_Socio_Technical_System_of_Moral_Engagement">AI Moral Enhancement: Upgrading the Socio-Technical System of Moral Engagement</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="3089271" href="https://southernct.academia.edu/RichardVolkman">Richard Volkman</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Science and Engineering Ethics</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;AI Moral Enhancement: Upgrading the Socio-Technical System of Moral Engagement&quot;,&quot;attachmentId&quot;:101399909,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/100632238/AI_Moral_Enhancement_Upgrading_the_Socio_Technical_System_of_Moral_Engagement&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" 
href="https://www.academia.edu/100632238/AI_Moral_Enhancement_Upgrading_the_Socio_Technical_System_of_Moral_Engagement"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="15" data-entity-id="75146460" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/75146460/Responsibility_assignment_won_t_solve_the_moral_issues_of_artificial_intelligence">Responsibility assignment won’t solve the moral issues of artificial intelligence</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="864591" href="https://fz-juelich.academia.edu/JanHendrikHeinrichs">Jan-Hendrik Heinrichs</a></div><p class="ds-related-work--metadata ds2-5-body-xs">AI and Ethics</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;Responsibility assignment won’t solve the moral issues of artificial intelligence&quot;,&quot;attachmentId&quot;:83032781,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/75146460/Responsibility_assignment_won_t_solve_the_moral_issues_of_artificial_intelligence&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/75146460/Responsibility_assignment_won_t_solve_the_moral_issues_of_artificial_intelligence"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="16" data-entity-id="62822139" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/62822139/Responses_to_a_Critique_of_Artificial_Moral_Agents">Responses to a Critique of Artificial Moral Agents</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="81330852" href="https://independent.academia.edu/BByford">Ben Byford</a></div><p class="ds-related-work--metadata ds2-5-body-xs">2019</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;Responses to a Critique of Artificial Moral Agents&quot;,&quot;attachmentId&quot;:75467082,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/62822139/Responses_to_a_Critique_of_Artificial_Moral_Agents&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link 
ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/62822139/Responses_to_a_Critique_of_Artificial_Moral_Agents"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="17" data-entity-id="66869740" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/66869740/Framing_Effects_on_Judgments_of_Social_Robots_Im_Moral_Behaviors">Framing Effects on Judgments of Social Robots&#39; (Im)Moral Behaviors</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="273869" href="https://syr.academia.edu/JaimeBanks">Jaime Banks</a></div><p class="ds-related-work--metadata ds2-5-body-xs"> Frontiers in Robotics and AI, 2021</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;Framing Effects on Judgments of Social Robots&#39; (Im)Moral Behaviors&quot;,&quot;attachmentId&quot;:77899688,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/66869740/Framing_Effects_on_Judgments_of_Social_Robots_Im_Moral_Behaviors&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/66869740/Framing_Effects_on_Judgments_of_Social_Robots_Im_Moral_Behaviors"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="18" data-entity-id="75287086" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/75287086/Whose_morality_Which_rationality_Challenging_artificial_intelligence_as_a_remedy_for_the_lack_of_moral_enhancement">Whose morality? Which rationality? Challenging artificial intelligence as a remedy for the lack of moral enhancement</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="40797651" href="https://independent.academia.edu/SilvijaSerafimova">Silviya Serafimova</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Humanities and Social Sciences Communications</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;Whose morality? Which rationality? 
Challenging artificial intelligence as a remedy for the lack of moral enhancement&quot;,&quot;attachmentId&quot;:83116586,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/75287086/Whose_morality_Which_rationality_Challenging_artificial_intelligence_as_a_remedy_for_the_lack_of_moral_enhancement&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/75287086/Whose_morality_Which_rationality_Challenging_artificial_intelligence_as_a_remedy_for_the_lack_of_moral_enhancement"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="19" data-entity-id="40754446" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/40754446/THE_MORAL_MACHINE_EXPERIMENT">THE MORAL MACHINE EXPERIMENT</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="6430796" href="https://wau.academia.edu/ajgroschel">Amilcar Gröschel, Jr.</a></div><p class="ds-related-work--metadata ds2-5-body-xs">2018</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;THE MORAL MACHINE EXPERIMENT&quot;,&quot;attachmentId&quot;:61035399,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/40754446/THE_MORAL_MACHINE_EXPERIMENT&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/40754446/THE_MORAL_MACHINE_EXPERIMENT"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="20" data-entity-id="109180654" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/109180654/Artificial_Moral_Advisors">Artificial Moral Advisors</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="31740949" href="https://independent.academia.edu/JamieWebb">Jamie Webb</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Proceedings of the 2022 AAAI/ACM Conference on AI, Ethics, and Society</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;Artificial Moral 
Advisors&quot;,&quot;attachmentId&quot;:107381461,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/109180654/Artificial_Moral_Advisors&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/109180654/Artificial_Moral_Advisors"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div><div class="ds-related-work--container js-related-work-sidebar-card" data-collection-position="21" data-entity-id="103607085" data-sort-order="default"><a class="ds-related-work--title js-related-work-grid-card-title ds2-5-body-md ds2-5-body-link" href="https://www.academia.edu/103607085/Artificial_Intelligence_and_Morality_A_Social_Responsibility">Artificial Intelligence and Morality: A Social Responsibility</a><div class="ds-related-work--metadata"><a class="js-related-work-grid-card-author ds2-5-body-sm ds2-5-body-link" data-author-id="150077353" href="https://independent.academia.edu/AnuradhaKanade">Anuradha Kanade</a></div><p class="ds-related-work--metadata ds2-5-body-xs">Journal of Intelligence Studies in Business</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;Artificial Intelligence and Morality: A Social Responsibility&quot;,&quot;attachmentId&quot;:103570454,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/103607085/Artificial_Intelligence_and_Morality_A_Social_Responsibility&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-related-work-grid-card-view-pdf" href="https://www.academia.edu/103607085/Artificial_Intelligence_and_Morality_A_Social_Responsibility"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div></div><div class="ds-related-content--container"><h2 class="ds-related-content--heading">Related topics</h2><div class="ds-research-interests--pills-container"><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="37" rel="nofollow" href="https://www.academia.edu/Documents/in/Information_Systems">Information Systems</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="221" rel="nofollow" href="https://www.academia.edu/Documents/in/Psychology">Psychology</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="237" rel="nofollow" href="https://www.academia.edu/Documents/in/Cognitive_Science">Cognitive Science</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="248" rel="nofollow" href="https://www.academia.edu/Documents/in/Social_Psychology">Social Psychology</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="422" rel="nofollow" 
href="https://www.academia.edu/Documents/in/Computer_Science">Computer Science</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="465" rel="nofollow" href="https://www.academia.edu/Documents/in/Artificial_Intelligence">Artificial Intelligence</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="26817" rel="nofollow" href="https://www.academia.edu/Documents/in/Algorithm">Algorithm</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="34442" rel="nofollow" href="https://www.academia.edu/Documents/in/Responsibility">Responsibility</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="83952" rel="nofollow" href="https://www.academia.edu/Documents/in/Morality">Morality</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="125342" rel="nofollow" href="https://www.academia.edu/Documents/in/Attribution">Attribution</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="709926" rel="nofollow" href="https://www.academia.edu/Documents/in/Attributions">Attributions</a><a class="js-related-research-interest ds-research-interests--pill" data-entity-id="1011475" rel="nofollow" href="https://www.academia.edu/Documents/in/Computers_In_Human_Behavior">Computers In Human Behavior</a></div></div></div></div></div><div class="footer--content"><ul class="footer--main-links hide-on-mobile"><li><a href="https://www.academia.edu/about">About</a></li><li><a href="https://www.academia.edu/press">Press</a></li><li><a href="https://www.academia.edu/documents">Papers</a></li><li><a href="https://www.academia.edu/topics">Topics</a></li><li><a href="https://www.academia.edu/hiring"><svg style="width: 13px; height: 13px; position: relative; bottom: -1px;" aria-hidden="true" focusable="false" data-prefix="fas" data-icon="briefcase" class="svg-inline--fa fa-briefcase fa-w-16" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path fill="currentColor" d="M320 336c0 8.84-7.16 16-16 16h-96c-8.84 0-16-7.16-16-16v-48H0v144c0 25.6 22.4 48 48 48h416c25.6 0 48-22.4 48-48V288H320v48zm144-208h-80V80c0-25.6-22.4-48-48-48H176c-25.6 0-48 22.4-48 48v48H48c-25.6 0-48 22.4-48 48v80h512v-80c0-25.6-22.4-48-48-48zm-144 0H192V96h128v32z"></path></svg>&nbsp;<strong>We&#39;re Hiring!</strong></a></li><li><a href="https://support.academia.edu/hc/en-us"><svg style="width: 12px; height: 12px; position: relative; bottom: -1px;" aria-hidden="true" focusable="false" data-prefix="fas" data-icon="question-circle" class="svg-inline--fa fa-question-circle fa-w-16" role="img" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 512 512"><path fill="currentColor" d="M504 256c0 136.997-111.043 248-248 248S8 392.997 8 256C8 119.083 119.043 8 256 8s248 111.083 248 248zM262.655 90c-54.497 0-89.255 22.957-116.549 63.758-3.536 5.286-2.353 12.415 2.715 16.258l34.699 26.31c5.205 3.947 12.621 3.008 16.665-2.122 17.864-22.658 30.113-35.797 57.303-35.797 20.429 0 45.698 13.148 45.698 32.958 0 14.976-12.363 22.667-32.534 33.976C247.128 238.528 216 254.941 216 296v4c0 6.627 5.373 12 12 12h56c6.627 0 12-5.373 12-12v-1.333c0-28.462 83.186-29.647 83.186-106.667 0-58.002-60.165-102-116.531-102zM256 338c-25.365 0-46 20.635-46 46 0 25.364 20.635 46 46 46s46-20.636 46-46c0-25.365-20.635-46-46-46z"></path></svg>&nbsp;<strong>Help Center</strong></a></li></ul><ul class="footer--research-interests"><li>Find new research papers in:</li><li><a 
href="https://www.academia.edu/Documents/in/Physics">Physics</a></li><li><a href="https://www.academia.edu/Documents/in/Chemistry">Chemistry</a></li><li><a href="https://www.academia.edu/Documents/in/Biology">Biology</a></li><li><a href="https://www.academia.edu/Documents/in/Health_Sciences">Health Sciences</a></li><li><a href="https://www.academia.edu/Documents/in/Ecology">Ecology</a></li><li><a href="https://www.academia.edu/Documents/in/Earth_Sciences">Earth Sciences</a></li><li><a href="https://www.academia.edu/Documents/in/Cognitive_Science">Cognitive Science</a></li><li><a href="https://www.academia.edu/Documents/in/Mathematics">Mathematics</a></li><li><a href="https://www.academia.edu/Documents/in/Computer_Science">Computer Science</a></li></ul><ul class="footer--legal-links hide-on-mobile"><li><a href="https://www.academia.edu/terms">Terms</a></li><li><a href="https://www.academia.edu/privacy">Privacy</a></li><li><a href="https://www.academia.edu/copyright">Copyright</a></li><li>Academia &copy;2025</li></ul></div> </body> </html>

Pages: 1 2 3 4 5 6 7 8 9 10