The risks of autonomous machines: from responsibility gaps to control gaps

Frank Hindriks
Synthese (Springer Science and Business Media LLC), 17 pages

Abstract

Responsibility gaps concern the attribution of blame for harms caused by autonomous machines. The worry has been that, because they are artificial agents, it is impossible to attribute blame, even though doing so would be appropriate given the harms they cause. We argue that there are no responsibility gaps. The harms can be blameless. And if they are not, the blame that is appropriate is indirect and can be attributed to designers, engineers, software developers, manufacturers or regulators.
The real problem lies elsewhere: autonomous machines should be built so as to exhibit a level of risk that is morally acceptable. If they fall short of this standard, they exhibit what we call ‘a control gap.’ The causal control that autonomous machines have will then fall short of the guidance control they should emulate.

Related papers

If Robots Cause Harm, Who is to Blame? Self-Driving Cars and Criminal Liability
Sabine Gless, SSRN Electronic Journal, 2016

On the legal responsibility of autonomous machines
Marek Jakubiec and Bartosz Brozek, Artificial Intelligence and Law, 2017
The paper concerns the problem of the legal responsibility of autonomous machines. In our opinion it boils down to the question of whether such machines can be seen as real agents through the prism of folk-psychology. We argue that autonomous machines cannot be granted the status of legal agents. Although this is quite possible from purely technical point of view, since the law is a conventional tool of regulating social interactions and as such can accommodate various legislative constructs, including legal responsibility of autonomous artificial agents, we believe that it would remain a mere 'law in books', never materializing as 'law in action'. It is not impossible to imagine that the evolution of our conceptual apparatus will reach a stage, when autonomous robots become full-blooded moral and legal agents.
However, today at least, we seem to be far from this point.

The Liability Problem for Autonomous Artificial Agents
Peter M Asaro, 2016
This paper describes and frames a central ethical issue–the liability problem–facing the regulation of artificial computational agents, including artificial intelligence (AI) and robotic systems, as they become increasingly autonomous, and supersede current capabilities. While it frames the issue in legal terms of liability and culpability, these terms are deeply imbued and interconnected with their ethical and moral correlate–responsibility. In order for society to benefit from advances in AI technology, it will be necessary to develop regulatory policies which manage the risk and liability of deploying systems with increasingly autonomous capabilities. However, current approaches to liability have difficulties when it comes to dealing with autonomous artificial agents because their behavior may be unpredictable to those who create and deploy them, and they will not be proper legal or moral agents.
This problem is the motivation for a research project that will explore the fundamen...

The Retribution-Gap and Responsibility-Loci Related to Robots and Automated Technologies: A Reply to Nyholm
Roos de Jong, Science and Engineering Ethics

Truly Autonomous Machines Are Ethical
John Hooker, AI Magazine, 2019

Attributing Agency to Automated Systems: Reflections on Human-Robot Collaborations and Responsibility-Loci
Sven Nyholm, Science and Engineering Ethics, 2018
Many ethicists writing about automated systems (e.g. self-driving cars and autonomous weapons systems) attribute agency to these systems. Not only that, they seemingly attribute an autonomous or independent form of agency to these machines. This leads some ethicists to worry about responsibility-gaps and retribution-gaps in cases where automated systems harm or kill human beings. In this paper, I consider what sorts of agency it makes sense to attribute to most current forms of automated systems, in particular automated cars and military robots. I argue that whereas it indeed makes sense to attribute different forms of fairly sophisticated agency to these machines, we ought not to regard them as acting on their own, independently of any human beings. Rather, the right way to understand the agency exercised by these machines is in terms of human-robot collaborations, where the humans involved initiate, supervise, and manage the agency of their robotic collaborators.
This means, I argue, that there is much less room for justified worries about responsibility-gaps and retribution-gaps than many ethicists think.

Responsibility for Crashes of Autonomous Vehicles: An Ethical Analysis
Alexander Hevelke
A number of companies including Google and BMW are currently working on the development of autonomous cars. But if fully autonomous cars are going to drive on our roads, it must be decided who is to be held responsible in case of accidents. This involves not only legal questions, but also moral ones. The first question discussed is whether we should try to design the tort liability for car manufacturers in a way that will help along the development and improvement of autonomous vehicles. In particular, Patrick Lin’s concern that any security gain derived from the introduction of autonomous cars would constitute a trade-off in human lives will be addressed. The second question is whether it would be morally permissible to impose liability on the user based on a duty to pay attention to the road and traffic and to intervene when necessary to avoid accidents. Doubts about the moral legitimacy of such a scheme are based on the notion that it is a form of defamation if a person is held to blame for causing the death of another by his inattention if he never had a real chance to intervene. Therefore, the legitimacy of such an approach would depend on the user having an actual chance to do so.
The last option discussed in this paper is a system in which a person using an autonomous vehicle has no duty (and possibly no way) of interfering, but is still held (financially, not criminally) responsible for possible accidents. Two ways of doing so are discussed, but only one is judged morally feasible.

SHOULD AUTONOMOUS AGENTS BE LIABLE FOR WHAT THEY DO
Jaap Hage
This article addresses the question whether autonomous agents can be held responsible for their acts. In this connection autonomous systems are taken to be non-human systems which do things which would be considered acts if performed by humans. The argument, which leads to the conclusion that autonomous agents can be held responsible for their acts is based on an analogy between human beings and autonomous agents and its main element is that if humans can be held responsible, so can, in principle, autonomous agents. This argument can only be convincing if the relevant similarities between human beings and autonomous agents are more important than the relevant differences. An important part of the argument is therefore aimed at showing precisely this. The main point here is that the argument does not claim that autonomous agents are actually like human beings, but rather that human beings are actually like autonomous agents. This analogy can only lead to the conclusion that autonomous agents can be held responsible if it is assumed that human beings can be held responsible, even if they – as the argument assumes – are like autonomous agents. This will be argued indeed, and leads to the transition from the question whether human beings and autonomous agents can be held responsible and liable to the question whether it is desirable to do so.
The answer to this last question is guardedly affirmative: it depends on the circumstances, but yes, sometimes it is desirable to hold human beings and autonomous agents responsible and liable for what they did. Therefore it sometimes makes sense to do so.

Mind the Gap: Responsible Robotics and the Problem of Responsibility
David Gunkel, Ethics and Information Technology, 2017
The task of this essay is to respond to the question concerning robots and responsibility—to answer for the way that we understand, debate, and decide who or what is able to answer for decisions and actions undertaken by increasingly interactive, autonomous, and sociable mechanisms. The analysis proceeds through three steps or movements. 1) It begins by critically examining the instrumental theory of technology, which determines the way one typically deals with and responds to the question of responsibility when it involves technology. 2) It then considers three instances where recent innovations in robotics challenge this standard operating procedure by opening gaps in the usual way of assigning responsibility. The innovations considered in this section include: autonomous technology, machine learning, and social robots.
3) The essay concludes by evaluating the three different responses—instrumentalism 2.0, machine ethics, and hybrid responsibility—that have been made in face of these difficulties in an effort to map out the opportunities and challenges of and for responsible robotics.

Praise the Machine! Punish the Human! The Contradictory History of Accountability in Automated Aviation
M. C. Elish and Tim Hwang
What will happen to current regimes of liability when driverless cars become commercially available? What happens when there is no human actor — only a computational agent — responsible for an accident? This white paper addresses these questions by examining the historical emergence and response to technologies of autopilot and cruise control. Through an examination of technical, social and legal histories, we observe a counter-intuitive focus on human responsibility even while human action is increasingly replaced by automation. We argue that a potential legal crisis with respect to driverless cars and other autonomous vehicles is unlikely. Despite this, we propose that the debate around liability and autonomous systems be reframed more precisely to reflect the agentive role of designers and engineers as well as the new and unique kinds of human action attendant to autonomous systems. The advent of commercially available autonomous vehicles, like the driverless car, presents an opportunity to reconfigure regimes of liability that reflect realities of informational asymmetry between designers and consumers.
Our paper concludes by offering a set of policy principles to guide future legislation.</p><div class="ds-related-work--ctas"><button class="ds2-5-text-link ds2-5-text-link--inline js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;wsj-grid-card-download-pdf-modal&quot;,&quot;work_title&quot;:&quot;Praise the Machine! Punish the Human! The Contradictory History of Accountability in Automated Aviation&quot;,&quot;attachmentId&quot;:49355242,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;work_url&quot;:&quot;https://www.academia.edu/28920933/Praise_the_Machine_Punish_the_Human_The_Contradictory_History_of_Accountability_in_Automated_Aviation&quot;,&quot;alternativeTracking&quot;:true}"><span class="material-symbols-outlined" style="font-size: 18px" translate="no">download</span><span class="ds2-5-text-link__content">Download free PDF</span></button><a class="ds2-5-text-link ds2-5-text-link--inline js-wsj-grid-card-view-pdf" href="https://www.academia.edu/28920933/Praise_the_Machine_Punish_the_Human_The_Contradictory_History_of_Accountability_in_Automated_Aviation"><span class="ds2-5-text-link__content">View PDF</span><span class="material-symbols-outlined" style="font-size: 18px" translate="no">chevron_right</span></a></div></div></div></div><div class="ds-sticky-ctas--wrapper js-loswp-sticky-ctas hidden"><div class="ds-sticky-ctas--grid-container"><div class="ds-sticky-ctas--container"><button class="ds2-5-button js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;continue-reading-button--sticky-ctas&quot;,&quot;attachmentId&quot;:105982912,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;workUrl&quot;:null}">See full PDF</button><button class="ds2-5-button ds2-5-button--secondary js-swp-download-button" data-signup-modal="{&quot;location&quot;:&quot;download-pdf-button--sticky-ctas&quot;,&quot;attachmentId&quot;:105982912,&quot;attachmentType&quot;:&quot;pdf&quot;,&quot;workUrl&quot;:null}"><span class="material-symbols-outlined" style="font-size: 20px" translate="no">download</span>Download PDF</button></div></div></div><div class="ds-below-fold--grid-container"><div class="ds-work--container js-loswp-embedded-document"><div class="attachment_preview" data-attachment="Attachment_105982912" style="display: none"><div class="js-scribd-document-container"><div class="scribd--document-loading js-scribd-document-loader" style="display: block;"><img alt="Loading..." src="//a.academia-assets.com/images/loaders/paper-load.gif" /><p>Loading Preview</p></div></div><div style="text-align: center;"><div class="scribd--no-preview-alert js-preview-unavailable"><p>Sorry, preview is currently unavailable. 
Related papers

When HAL Kills, Stop Asking Who's to Blame: Autonomous machines and responsibility. Minao Kukita.
Sharing Moral Responsibility with Robots: A Pragmatic Approach. Gordana Dodig-Crnkovic. … of the 2008 conference on Tenth …, 2008.
Blaming humans in autonomous vehicle accidents: Shared responsibility across levels of automation. Sohan Dsouza. arXiv (Cornell University), 2018.
Meaningful Human Control over Autonomous Systems: A Philosophical Account. Filippo Santoni de Sio. Frontiers in Robotics and AI.
A Theory of Vicarious Liability for Autonomous Machine Caused Harm. Pinchas Huberman. Osgoode Hall Law Journal, 2021.
Mind the Gap: Autonomous Systems, the Responsibility Gap, and Moral Entanglement. Trystan S Goetze. Proceedings of the 2022 ACM Conference on Fairness, Accountability, and Transparency (FAccT '22), 2022.
Playing the Blame Game with Robots. Michael T Stuart and Markus Kneer. Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction (HRI'21 Companion), March 8-11, 2021, Boulder, CO, USA. ACM, New York, NY, USA, 2021.
Artificial intelligence safety engineering: Why machine ethics is a wrong approach. Roman Yampolskiy. 2011.
Transformations of Responsibility in the Age of Automation: Being Answerable to Human and Non-Human Others. Mark Coeckelbergh. Techno:Phil – Aktuelle Herausforderungen der Technikphilosophie, 2020.
Autonomy and Responsibility in Hybrid Systems: The Example of Autonomous Cars, in: Patrick Lin, Keith Abney, and Ryan Jenkins (Hg.): "Robot Ethics 2.0". Wulf Loh and Janina Loh (geb. Sombetzki). Oxford University Press, 2017.
Risk and Robots - some ethical issues. Peter Olsthoorn. Paper for the 2011 conference on The Ethics of Emerging Military Technologies, organized by The International Society for Military Ethics and hosted by the University of San Diego.
The Responsibility Quantification Model of Human Interaction With Automation. Joachim Meyer. IEEE Transactions on Automation Science and Engineering.
Towards Trustworthy Intelligent Robots - A Pragmatic Approach to Moral Responsibility. Baran Curuklu and Gordana Dodig-Crnkovic. mrtc.mdh.se.
Robots Liability: A Use Case and a Potential Solution. Alejandro Zornoza Somolinos.
Responsible Machines: The Opportunities and Challenges of Artificial Autonomous Agents. David Gunkel.
Theoretical foundations for the responsibility of autonomous agents. Jaap Hage. Artificial Intelligence and Law.
Can we Bridge AI's responsibility gap at Will? Maximilian Kiener. Ethical Theory and Moral Practice.
May 2016 talk at Dagstuhl Seminar 16222, "Engineering Moral Agents – from Human Morality to Artificial Morality", with a short intro on "Responsibility in Human-Machine Interaction". Janina Loh (geb. Sombetzki).
Civil Liability and Artificial Intelligence: Who is Responsible for Damages Caused by Autonomous Intelligent Systems? Douglas L Binda Filho. 2020.
Autonomous technology and the greater human good. Steve Omohundro. Journal of Experimental & Theoretical Artificial Intelligence, 2014.
Robots Liability: A Use Case and a Potential Solution. Alejandro Zornoza Somolinos. Robotics - Legal, Ethical and Socioeconomic Impacts, 2017.
Negligence Failures and Negligence Fixes: A Comparative Analysis of Criminal Regulation of AI and Autonomous Vehicles. Alice Giannini. Criminal Law Forum, 2023.
Assigning Liability in an Autonomous World. Agni Sharma. 2017.

Related topics

Philosophy, Philosophy of Language, Philosophy of Science, Responsibility, Worry, Autonomous Vehicles, Moral Responsibility, Attribution, Blame, Control Management, Synthese, meaningful human control.
