<!doctype html> <html lang="en"> <head> <meta charset="UTF-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <title>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief | The National Academies Press</title> <meta name="description" content="Read chapter Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief: The online inform..."> <meta name="citation_doi" content="10.17226/27997"> <meta name="prism.doi" content="10.17226/27997"> <meta name="dc.identifier" content="doi:10.17226/27997"> <link rel="canonical" href="https://nap.nationalacademies.org/read/27997/chapter/1"> <link href='//fonts.googleapis.com/css?family=Open+Sans:300,300italic,400,400italic,700,700italic' rel='stylesheet' type='text/css'>
<link href='//fonts.googleapis.com/css?family=Lora:400,400italic,700,700italic' rel='stylesheet' type='text/css'> <link rel="stylesheet" href="/read/css/openbook.css?v=1"> <link rel="stylesheet" href="/read/stylesheets/27997/styles/custom.css"> <style>pre { background-color: transparent !important; }</style> </head> <body> <div id="container"> <header> <a class="logo" href="/" title="The National Academies Press">The National Academies Press</a> </header> <div class="title-row"> <div class="title-info"> <h1 class="title-book"> <a href="https://nap.nationalacademies.org/catalog/27997/evolving-technological-legal-and-social-solutions-to-counter-online-disinformation">Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</a> <span class="title-copyright">(2024)</span> </h1> <h2 class="title-chapter"><strong>Chapter:</strong> Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</h2> </div> <div class="clearfix"></div> </div> <div class="print-info"> Visit <span class="print-info-link">NAP.edu/10766</span> to get more information about this book, to buy it in print, or to download it as a free PDF. 
</div> <div class="content"> <div id="openbook-html"> <div class="page" data-page="1" data-display="1" data-format="html"> <div class="page-tools"> <div class="page-tools-panel"> <div class="panel-citation"> <strong>Suggested Citation:</strong> "Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief." National Academies of Sciences, Engineering, and Medicine. 2024. <em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</em>. Washington, DC: The National Academies Press. doi: 10.17226/27997. 
</div> </div> </div> <div class="clearfix"></div> <div class="page-html"> <table style="width: 100%;"> <colgroup> <col style="width: 50%;"> <col style="width: 50%;"> </colgroup> <tbody> <tr> <td><img alt="NATIONAL ACADEMIES Sciences Engineering Medicine" src="/openbook/27997/xhtml/images/logo.jpg" width="373" height="69" class="no-padding"></td> <td><b>Proceedings of a Workshop&mdash;in Brief</b></td> </tr> </tbody> </table> <h1>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation</h1> <h1 class="h1a">Proceedings of a Workshop&mdash;in Brief</h1> <hr> <p class="tx">The online information environment enables the global exchange of information and ideas, but it also contributes to the proliferation of disinformation.<sup><a role="doc-noteref" epub:type="noteref" href="#pz1-7" id="Axq">1</a></sup> Online platforms operate at a scale where human-based content moderation to counter disinformation is impractical or at least very expensive and where purely technical solutions are challenging because content is often context-dependent. 
The speed, scale, and complexity of this ecosystem suggest that solutions are needed that consider the global nature of disinformation and effectively blend technical and human capabilities.</p> <p class="tx">On April 10 and 11, 2024, an ad hoc committee under the auspices of the National Academies of Sciences, Engineering, and Medicine&rsquo;s Committee on Science, Technology, and Law (CSTL) convened a virtual workshop to consider practicable solutions to counter online disinformation, particularly in social media. The workshop agenda was organized around four interrelated areas: content moderation; educational interventions; technological interventions; and regulatory and other incentives and disincentives for behavior change. A broad &ldquo;call for solutions&rdquo; was issued to solicit novel approaches to detect, measure, and mitigate disinformation on social media, and 14 submissions were selected for discussion as part of the workshop.<sup><a role="doc-noteref" epub:type="noteref" href="#pz1-11" id="Aqnl">2</a></sup> Invited speakers made brief remarks about their proposed solutions, and via an interactive &ldquo;whiteboard,&rdquo; workshop participants shared comments and questions throughout the event. 
Introductory audio-visual resources (e.g., short videos or slide decks) with additional detail about speakers&rsquo; work were available prior to the workshop.<sup><a role="doc-noteref" epub:type="noteref" href="#pz1-12" id="A9sV">3</a></sup></p> <p class="tx"><b>Martha Minow</b> (Harvard University), co-chair of CSTL, welcomed participants, noting that the current workshop builds upon earlier CSTL work on Section 230 of the Communications Decency Act of 1996, which sought</p> <p class="tx1-1">__________________</p> <section class="footnote" epub:type="rearnotes"> <p epub:type="rearnote" class="fn" id="pz1-7"><sup><a href="#Axq">1</a></sup> <span role="doc-footnote" epub:type="footnote">Defining, understanding, measuring, and addressing disinformation is challenging, with some rejecting the term &ldquo;disinformation&rdquo; and others suggesting that the field of disinformation studies must be fundamentally rethought. Indeed, while the topic of the workshop was disinformation, in their remarks, speakers referred to disinformation, misinformation, and malinformation.</span></p> <p epub:type="rearnote" class="fn-1"><span role="doc-footnote" epub:type="footnote">For the purpose of these proceedings, disinformation is defined as information that is intentionally false and intended to deceive and mislead, hiding the interest and identity of those who developed and initially disseminated the disinformation. Misinformation is defined as false information presented as fact regardless of the intent to deceive, and malinformation is defined as information based on fact that is used out of context to mislead, manipulate, or harm.</span></p> <p epub:type="rearnote" class="fn" id="pz1-11"><sup><a href="#Aqnl">2</a></sup> <span role="doc-footnote" epub:type="footnote">More than 100 submissions were received. 
In addition to inviting presentations on selected solutions, the workshop planning committee invited seven individuals to speak about their work on disinformation as part of the four panel sessions.</span></p> <p epub:type="rearnote" class="fn" id="pz1-12"><sup><a href="#A9sV">3</a></sup> <span role="doc-footnote" epub:type="footnote">For the workshop&rsquo;s statement of task, biographical sketches of planning committee members and speakers, call for solutions, whiteboard, introductory audio-visual resources, and workshop video, see <a href="https://www.nationalacademies.org/event/41384_04-2024_evolving-technological-legal-and-social-solutions-to-counter-disinformation-in-social-media-a-workshop">https://www.nationalacademies.org/event/41384_04-2024_evolving-technological-legal-and-social-solutions-to-counter-disinformation-in-social-media-a-workshop</a>.</span></p> </section> </div> </div> <div class="page" data-page="2" data-display="2" data-format="html"> <div class="clearfix"></div> <div class="page-html"> <p class="tx1">to foster the growth of the internet by providing certain immunities for internet-based technology companies.<sup><a role="doc-noteref" epub:type="noteref" href="#pz2-5" id="ARgI">4</a></sup></p> <p class="tx">Planning committee co-chair <b>Saul Perlmutter</b> (University of California, Berkeley and Lawrence Berkeley National Laboratory) shared the committee&rsquo;s overarching goals: to be collaborative in a generative workshop; get many ideas and approaches on the table&mdash;ideally those that might be new to some; and suggest new collaborations, areas of research, and proposals to counter disinformation. Planning committee co-chair <b>Joan Donovan</b> (Boston University) said that the workshop is an important opportunity to identify effective measures for dealing with destructive online behavior.</p> <p class="tx">To open the workshop, <b>Jonathan Corpus Ong</b> (University of Massachusetts Amherst) provided an ethnographic perspective to raise participant awareness of the range of issues associated with disinformation. 
He challenged the field of disinformation studies to be more globally minded and community driven. We shouldn&rsquo;t simply find ways to be more inclusive, he said, but must work to transform unjust systems, institutions, and ways of working. He suggested that we should challenge tech advocacies and media representations that perpetuate stereotypes of the Global South and ask how the disinformation studies space can empower civil society and researchers to lead in knowledge creation and design solutions.</p> <p class="tx">Ong questioned whether it is possible to give researchers in the Global South access to the same toolkits (e.g., ad libraries, transparency audits, and local platform policy officers) and legal support measures available in the Global North. As an example of divergence of Global South and Global North activism, he referenced an instance where technology activists spoke out against proposed United Nations Educational, Scientific and Cultural Organization (UNESCO) guidelines for regulating digital platforms because they felt that the guidelines would enable a rubber stamping of local over-regulation. Separately, he described how, at a South-to-South knowledge exchange workshop in Rio de Janeiro, a leader of a Myanmar election coalition called out researchers from the North who inadvertently transformed the landscape of intervention by, for example, siphoning off local researchers to write case studies for Global North policy makers (at the expense of making strategic, hyperlocal interventions). 
As a blueprint for South-to-South knowledge exchange, Ong identified core principles developed by Global South researchers: 1) the region is a source of creativity, innovations, and solutions; 2) ways of working in the Global South are more culturally proximate and resonant than in the Global North, which means that tools from the Global North cannot simply be replicated; 3) reflexivity in collaboration, citation, and knowledge creation is important; and 4) custom-built solutions begin with a critique of unjust global governance structures and extractive systems of knowledge creation.</p> <h2>CONTENT MODERATION</h2> <p class="tx1">The first panel session, moderated by planning committee member <b>Amelia Acker</b> (University of Texas at Austin), explored approaches to counter online disinformation through content moderation, i.e., processes for reviewing and monitoring user-generated content for adherence to guidelines and standards.</p> <p class="tx"><b>Brigham Adams</b> (Goodly Labs) described Public Editor, a system designed to alert readers of popular online content to reasoning errors, cognitive biases, and rhetorical manipulations by labeling the words and phrases where these problems occur. Readers can see and learn from these labels, and the methods underpinning the ontology are open, transparent, and customizable. Adams said the overall goal of the project is to tilt public discourse toward better reasoning.</p> <p class="tx"><b>Amelia Burke-Garcia</b> (NORC at the University of Chicago) and colleagues are developing a model that assesses online COVID-19 vaccination information. The model assesses how messages are framed&mdash;though it does not necessarily assess their veracity. Particular challenges arise from the fact that disinformation sites use health communication best practices to appear official (e.g., through the use of professional-appearing visuals and text). 
The model, which is undergoing testing and is expected to be deployed later this year, can be applied to contexts other than COVID-19 vaccinations. Importantly,</p> <p class="tx1-1">__________________</p> <section class="footnote" epub:type="rearnotes"> <p epub:type="rearnote" class="fn" id="pz2-5"><sup><a href="#ARgI">4</a></sup> <span role="doc-footnote" epub:type="footnote">National Academies of Sciences, Engineering, and Medicine. 2021. <i>Section 230 Protections: Can Legal Revisions or Novel Technologies Limit Online Misinformation and Abuse?: Proceedings of a Workshop&mdash;in Brief</i>. Washington, DC: National Academies Press. <a href="https://doi.org/10.17226/26280">https://doi.org/10.17226/26280</a>.</span></p> </section> </div> </div> <div class="page" data-page="3" data-display="3" data-format="html"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 3 <span class="page-status-text"></span></span> <span class="page-actions"> <span class="page-actions-inner"> <a href="#" class="page-action share addthis_button" title="Share this page" addthis:url="http://www.nap.edu/read/27997/page/3" id="atic"> <span data-icon="&#xe607;"></span> <span class="icon-text">Share</span> </a> <a href="#" class="page-action cite" title="Cite this page"> <span data-icon="&#xe6ad;"></span> <span class="icon-text">Cite</span> </a> </span> </span> </div> <div class="page-tools-panel"> <div class="panel-citation"> <strong>Suggested Citation:</strong>"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief." National Academies of Sciences, Engineering, and Medicine. 2024. <em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</em>. Washington, DC: The National Academies Press. doi: 10.17226/27997. 
<div href="#" class="panel-close">&times;</div> </div> <div class="panel-bookmark"> <form action="" class="bookmark-form"> <input type="hidden" name="id" value=""> <input type="hidden" name="record_id" value="27997"> <input type="hidden" name="page" value="3"> <input type="hidden" name="chapter" value="1"> <div class="input-wrapper"> <input class="bookmark-note" type="text" name="note" placeholder="Add a note to your bookmark" value="" tabindex="-1"> </div> <div class="button-wrapper"> <a class="bookmark-button-save button"><span data-icon="&#xe72f;"></span><span class="icon-text">Save</span></a> </div> <div class="button-wrapper"> <a class="bookmark-button-delete button"><span data-icon="&#xe72e;"></span><span class="icon-text">Cancel</span></a> </div> </form> </div> </div> </div> <div class="clearfix"></div> <div class="page-html"> <p class="tx1">the project has examined where bias may be introduced in model development, and lessons learned from a bias assessment will be made available.</p> <p class="tx"><b>Sarah Cen</b> (Massachusetts Institute of Technology) said that her work addresses two types of barriers that make tackling disinformation challenging: legal and regulatory. Legal barriers are related to Section 230 language and First Amendment free speech guarantees (discussed further below). Regulatory barriers relate to how regulations are implemented (e.g., as guidelines subject to compliance audits). She and colleagues have avoided a global definition of what is &ldquo;good,&rdquo; instead building an approach around a more flexible standard. For example, a requirement that platforms only curate election-related content from white-listed sources might exclude some legitimate sources, but a standard might say that curated content may come from a wide range of sources if it is similar to content available from white-listed sources. 
Cen&rsquo;s team develops content-auditing procedures that are practical and interpretable and have guarantees&mdash;which means that they can be characterized in terms of what they can and cannot be used for.</p> <p class="tx"><b>Brenden Kuerbis</b> (Georgia Institute of Technology) spoke about research by the Internet Governance Project, which focuses on the scope, patterns, and trends of online, artificial intelligence (AI)-enabled disinformation during (nuclear) emergencies and on developing means to counteract the disinformation threat. He suggested that disinformation is a component within a broader propaganda framework. A preliminary project finding suggests that there is a need for a neutral, non-state-led networked governance structure with experts observing and countering threats, similar to the structures that have addressed spam, phishing, and other transnational cybersecurity issues. Such structures have focused on actual harms caused and on the economic incentives of actors that engage in these activities.</p> <p class="tx"><b>J. Nathan Matias</b> (Cornell University) said that adaptive algorithms or AI systems may adapt and respond to user behavior in ways that make it unclear whether they promote or hinder the spread of misinformation. Further, a long history of exclusion by race, gender, and culture has led to biases across computing and the social sciences. He suggested that these act as obstacles to creating infrastructure to address misinformation.<sup><a role="doc-noteref" epub:type="noteref" href="#AQs" id="Att">5</a></sup> The Citizens and Technology (CAT) Lab at Cornell University is co-creating software tools and research infrastructure for large-scale data analysis and experimentation. 
Matias underscored the need to protect researchers working in the media, academia, and civil society through efforts like the Coalition for Independent Technology Research,<sup><a role="doc-noteref" epub:type="noteref" href="#ANG" id="Av9">6</a></sup> though he emphasized that it is important to support all individuals who do essential work&mdash;not just researchers.</p> <p class="tx"><b>John Wihbey</b> (Northeastern University) proposed a research agenda stimulated by AI developments. Wihbey foresees, though does not necessarily endorse, greater use of chatbots that integrate large language models (LLMs) to counter disinformation. He suggested that, while LLMs offer advantages to social media companies in terms of speed and scale, there are ethical and design risks associated with their use. The classic triad for dealing with disinformation is to remove, reduce, and inform, but a new response might be a top-down authority that takes the form of a chatbot that interfaces with users and provides assistance, mediation, and warnings when disinformation is encountered. Wihbey called for &ldquo;getting ahead&rdquo; of the potential impacts of LLMs and &ldquo;bulking up&rdquo; ethical frameworks by considering how AI-related principles might be applied to situations where LLMs are used to counter disinformation.</p> <h2>REFLECTIONS FROM THE PANEL DISCUSSANT</h2> <p class="tx1"><b>Nicole Cooke</b> (University of South Carolina) reflected on themes of access, transparency, community wisdom, and the need to include humans in interventions. She noted Adams&rsquo; emphasis on the need to account for cognitive, social, and emotional biases and observed that it is important both to agree on a taxonomy when talking about mis-, dis-, and malinformation<sup><a role="doc-noteref" epub:type="noteref" href="#ANn" id="A3SZ">7</a></sup> and to involve different disciplines and the public. 
Cooke likened efforts to create solutions to counter disinformation to &ldquo;building the train tracks as the train is already speeding along.&rdquo; She questioned the feasibility of collaboration with governments when information is omitted, withheld, or weaponized. Cen&rsquo;s research led Cooke to consider approaches for getting more people to understand inter-</p> <p class="tx1-1">__________________</p> <section class="footnote" epub:type="rearnotes"> <p epub:type="rearnote" class="fn" id="AQs"><sup><a href="#Att">5</a></sup> <span role="doc-footnote" epub:type="footnote">For a definition of misinformation, see <a href="/read/27997/chapter/1#pz1-7">footnote 1</a>.</span></p> <p epub:type="rearnote" class="fn" id="ANG"><sup><a href="#Av9">6</a></sup> <span role="doc-footnote" epub:type="footnote">See <a href="https://independenttechresearch.org/">https://independenttechresearch.org/</a>.</span></p> <p epub:type="rearnote" class="fn" id="ANn"><sup><a href="#A3SZ">7</a></sup> <span role="doc-footnote" epub:type="footnote">For a definition of malinformation, see <a href="/read/27997/chapter/1#pz1-7">footnote 1</a>.</span></p> </section> </div> </div> <div class="page" data-page="4" data-display="4" data-format="html"> <div class="clearfix"></div> <div class="page-html"> <p class="tx1">ventions. She noted that when technology leaders have testified before Congress, it is apparent that the technical aspects of disinformation are challenging for policy makers; to make the most of interventions, more education will be needed. She expressed appreciation for Wihbey&rsquo;s focus on ethics and keeping people at the forefront when considering the use of LLM chatbots.</p> <h2>DISCUSSION</h2> <p class="tx1">Acker asked about the infrastructure and partnerships needed to implement solutions offered by panelists. Adams said that Public Editor needs 2,000-3,000 annotators of content and is building its workforce, but that it has created tools for classrooms and citizen scientists. 
Burke-Garcia said her project, and other health projects addressing disinformation, will grow through collaboration with health experts, technology experts, and community members. Matias said that it is a challenge to engage a diverse public and called for more work to engage communities, akin to the Cooperative Extension System of the U.S. Department of Agriculture.<sup><a role="doc-noteref" epub:type="noteref" href="#pz4-4" id="Auu">8</a></sup> Cen said that there is a need for an auditing infrastructure for platforms, given limits on the amount of information that researchers can ask platforms to provide. Wihbey highlighted the National Internet Observatory, which is being designed to provide researchers with access to large, representative panels of U.S. internet users for the purpose of researching online behaviors.<sup><a role="doc-noteref" epub:type="noteref" href="#pz4-5" id="AYrA">9</a></sup></p> <p class="tx">An audience member noted the challenges of implementing solutions for countering disinformation when many politicians and members of the public are suspicious of scientists and their motives. Burke-Garcia said that community-trusted opinion leaders can become &ldquo;support mechanisms&rdquo; for sharing information and solutions with their communities, as happens with education about public health. Matias mentioned the debate over design as an influence on outcomes and referred to a lawsuit filed by a group of attorneys general against Meta over products designed to keep young users online longer and bring them back to the platform repeatedly.<sup><a role="doc-noteref" epub:type="noteref" href="#pz4-6" id="AGH">10</a></sup> He suggested that a decision against Meta could open the way for a range of potential regulations and interventions.</p> <p class="tx">Wihbey and Adams said that there is a role for both automated and human moderation of online content. 
Wihbey called for incentives to encourage platforms to collaborate more with civil society organizations, for example, in training LLM chatbots to identify hate speech. Adams said that human moderation is challenging to scale and is relatively unstructured (in that different moderators may bring in their own biases and assess the same content differently). AI-enabled content moderation, while scalable, may miss content nuances discerned by humans.</p> <p class="tx">Wihbey highlighted the problem of information taken out of context (i.e., malinformation), noting that Russian propaganda outlets have taken legitimate stories about the United States and amplified and distorted the negative elements to achieve disinformation goals. Cen pointed out the challenges in identifying the provenance of data. Humans typically verify information by checking its sources, but it is not always possible to identify the source of social media content. Rather than label an entire article &ldquo;fake news,&rdquo; Adams said that we might instead identify particular pieces of the content that are problematic. He noted that research that identifies social harms can induce social media platforms to change practices, as required under the European Union&rsquo;s (EU&rsquo;s) Digital Services Act (DSA).<sup><a role="doc-noteref" epub:type="noteref" href="#pz4-13" id="A5E">11</a></sup></p> <h2>EDUCATIONAL INTERVENTIONS</h2> <p class="tx1">The second panel session, moderated by planning committee member <b>Kate Starbird</b> (University of Washington), explored educational interventions that could help users become more resistant to disinformation and other forms of manipulation.</p> <p class="tx"><b>Matthew Groh</b> (Northwestern University) said that countering disinformation on social media now means countering AI-generated disinformation. 
Consumers need to understand how AI-generated media is created, including its capabilities and limitations; with this understanding, they can better dif-</p> <p class="tx1-1">__________________</p> <section class="footnote" epub:type="rearnotes"> <p epub:type="rearnote" class="fn" id="pz4-4"><sup><a href="#Auu">8</a></sup> <span role="doc-footnote" epub:type="footnote">See <a href="https://www.nifa.usda.gov/about-nifa/how-we-work/extension/cooperative-extension-system">https://www.nifa.usda.gov/about-nifa/how-we-work/extension/cooperative-extension-system</a>.</span></p> <p epub:type="rearnote" class="fn" id="pz4-5"><sup><a href="#AYrA">9</a></sup> <span role="doc-footnote" epub:type="footnote">The observatory is being built at Northeastern University with funding from the National Science Foundation. See <a href="https://nationalinternetobservatory.org">https://nationalinternetobservatory.org</a>.</span></p> <p epub:type="rearnote" class="fn" id="pz4-6"><sup><a href="#AGH">10</a></sup> <span role="doc-footnote" epub:type="footnote">For more information, see, e.g., <a href="https://www.cnbc.com/2023/10/24/bipartisan-group-of-ags-sue-meta-for-addictive-features.html">https://www.cnbc.com/2023/10/24/bipartisan-group-of-ags-sue-meta-for-addictive-features.html</a>.</span></p> <p epub:type="rearnote" class="fn" id="pz4-13"><sup><a href="#A5E">11</a></sup> <span role="doc-footnote" epub:type="footnote">The Digital Services Act (see <a href="https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en">https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en</a>) is discussed in greater depth below.</span></p> </section> </div> </div> <div class="page" data-page="5" data-display="5" data-format="html"> <div class="clearfix"></div> <div class="page-html"> <p class="tx1">ferentiate between what has been generated by AI and what has not (and may therefore be better equipped to recognize deep-fake videos, images, or audio). 
Groh suggested that media literacy should encompass generative AI literacy so that individuals can better discern potential AI-generated disinformation. He called for ongoing research to evaluate when, where, why, and how ordinary people can spot AI-generated images and how to boost the ability to determine that material is AI-generated.</p> <p class="tx"><b>Jonathan Osborne</b> (Stanford University) reported on a middle-school classroom curriculum developed with <b>Daniel Pimentel</b> (University of Alabama) to build resistance to scientific misinformation. The curriculum consists of nine once-monthly lessons that expose students to &ldquo;the grammar and language of science&rdquo; so that they can develop &ldquo;epistemic vigilance when confronted with misinformation&hellip;giving them the tools to make good choices about what to trust.&rdquo; The curriculum will be tested for efficacy with 50 science teachers.<sup><a role="doc-noteref" epub:type="noteref" href="#pz5-5" id="AXz">12</a></sup></p> <p class="tx"><b>Rub&eacute;n Piacentini</b> (University of Rosario, Argentina) said that disinformation in his country, and perhaps others, exists not just in social media but also in schools and other contexts. Piacentini suggested that students have little exposure to scientific experimentation, which is detrimental when they enter universities and careers. To help increase scientific literacy and skills, he has helped develop a citizen science research program that measures air quality throughout the community. Participants complete a questionnaire about what they know about air quality, conduct experiments, and then re-test their knowledge based on their hands-on experience.</p> <p class="tx"><b>Sander van der Linden</b> (University of Cambridge) described the challenge of identifying information that is false or that uses manipulative techniques. 
Citizens, he said, must be empowered to discern disinformation techniques in a manner that does not restrict their ability to form opinions. Van der Linden described a project to build &ldquo;immunization&rdquo; programs in collaboration with social media companies. YouTube has, for example, posted pre-bunks (&ldquo;weakened doses of misinformation or techniques used to produce misinformation to refute the techniques in advance&rdquo;), so that users can learn how to discern what is real and what is not. While lab results show promise, achieving large-scale &ldquo;herd immunity&rdquo; to the effects of disinformation will be a challenge.</p> <p class="tx"><b>Matt Verich</b> (The Disinformation Project) approaches disinformation from a national security perspective. He observed that disinformation is targeted at the public, but that the public is missing from conversations about disinformation. The Disinformation Project brings awareness to teenagers with limited knowledge of how they are being targeted on social media. It offers extracurricular, multidisciplinary project-based activities for student-led chapters that give students an understanding of the tactics bad actors use to manipulate, deceive, and divide. Resources and research are made available to teens to empower them to take action by changing online behaviors.</p> <h2>REFLECTIONS FROM THE PANEL DISCUSSANT</h2> <p class="tx1"><b>Sam Wineburg</b> (Stanford University) noted that the panelists&rsquo; interventions illustrate the need for multiple educational interventions. He asked a series of questions: How does each intervention identify the biggest threats, and do the proposed instructional designs tackle them? Are the interventions scalable in schools, and what is the research base behind them? Are the proposed measures ecologically valid, and do they match what students do online? 
He emphasized the importance of research questions and educational interventions grounded in students&rsquo; real-world behaviors and actions.</p> <h2>DISCUSSION</h2> <p class="tx1">For Verich, a priority is to increase teens&rsquo; awareness that the internet, while powerful, has dangers. He focuses on disinformation as a political weapon, noting that activities centered on this aspect provide residual benefits that allow teens to tackle other problems on the internet. Reflecting on a quote from physicist Richard Feynman, &ldquo;What I cannot create, I do not understand,&rdquo; Groh posited that if people understand how misinformation is created, they can discern when it is being used.</p> <p class="tx">Starbird asked about how to balance learning about disinformation tactics with instruction that may teach people to become manipulative or cynical. Van der Linden</p> <p class="tx1-1">__________________</p> <section class="footnote" epub:type="rearnotes"> <p epub:type="rearnote" class="fn" id="pz5-5"><sup><a href="#AXz">12</a></sup> <span role="doc-footnote" epub:type="footnote">Osborne pointed attendees to the publication &ldquo;Science Education in an Age of Misinformation&rdquo; for elaboration (J. Osborne et al. 2023. 
&ldquo;Science Education in an Age of Misinformation.&rdquo; <i>Science Education</i> 107: 553-571).</span></p> </section> </div> </div> <div class="page" data-page="6" data-display="6" data-format="html"> <div class="clearfix"></div> <div class="page-html"> <p class="tx1">drew on an inoculation analogy: vaccines provide a small quantity of a virus but not enough to make people ill. In a similar way, his project is designed to provide a toolkit that is insufficient for online manipulation of others. He added that, as is the case with vaccines, it is important to monitor side effects.</p> <p class="tx">Starbird noted that the proposed educational interventions can take place both inside and outside the classroom. Verich said that The Disinformation Project chose an extracurricular approach but that, for a durable response, interventions should take place both inside and outside of the classroom to promote behaviors that will mitigate the most critical harms.</p> <p class="tx">A participant asked how to support community-engaged solutions. Verich said that The Disinformation Project chapters are tailored to their communities. Communities also pass information to each other, van der Linden said. Starbird noted that some individuals and groups have claimed that media literacy brainwashes people. 
&ldquo;Even as we gather here,&rdquo; she said, &ldquo;there are politicized efforts to undermine&rdquo; efforts to counter disinformation and stop it in its tracks. &ldquo;When we approach educational interventions,&rdquo; she continued, &ldquo;how are we going to do that in a world where there are people who benefit from others being manipulated and do not necessarily want these interventions to be out there&rdquo; and who &ldquo;seem to be winning the day in some places?&rdquo;</p> <h2>TECHNOLOGICAL INTERVENTIONS</h2> <p class="tx1">Planning committee member <b>Beth Mara Goldberg</b> (Jigsaw/Google) moderated a panel session on technological interventions for countering disinformation that can complement content moderation, education, policy, and regulation.</p> <p class="tx"><b>Wajeeha Ahmad</b> (Stanford University) discussed her research to quantify and counter the financial incentives for spreading disinformation. She said that disinformation websites, like other websites, make money through advertising. Advertisements from companies across industries appear on disinformation websites, often without the companies&rsquo; knowledge. Her work suggests that up to 13 percent of consumers would change their buying behaviors if they knew a company was advertising on disinformation sites. Ahmad emphasized that financial incentives to spread disinformation can be countered by improving transparency about where companies advertise.</p> <p class="tx"><b>Michelle Amazeen</b> (Boston University) discussed how mainstream media contribute to disinformation by disguising paid content to look like news, which confuses readers. Her research suggests that paid content, also known as sponsored content or native advertising, may influence readers&rsquo; perceptions when they believe it is regular news content. It may also influence how journalists cover topics related to the advertiser. 
Moreover, when sponsored content is shared on social media, disclosures mandated by the Federal Trade Commission (FTC) often disappear,<sup><a role="doc-noteref" epub:type="noteref" href="#pz6-10" id="A46I">13</a></sup> and Google search results do not always indicate that content is sponsored. Native advertising may lead news outlets not to report unfavorable or contradictory information about the sponsoring company, which Amazeen characterized as a form of disinformation. She described her work with a team developing a protocol to create a native advertising observatory to identify and catalog thousands of native ad campaigns, beginning with those related to climate change, for use by researchers.</p> <p class="tx"><b>Christopher Impey</b> (University of Arizona) described his research group&rsquo;s two-pronged approach to combat science misinformation. One prong is an instructional module that can fit into any science course to teach students how to detect &ldquo;fake science.&rdquo; The second prong, which relies on machine learning, involves the development of a browser plug-in and smartphone app that will activate when a user goes to a webpage with scientific content and alert the user to potential misinformation. Impey compiled a list of the top 100 pseudo-science terms that appear on 1.3 billion unique webpages. 
Students are curating (i.e., classifying) articles based upon their legitimacy in order to develop training sets for neural networks.</p> <p class="tx">Pinch-hitting for <b>Andrew Jenks</b> (Coalition for Content Provenance and Authenticity [C2PA]), planning committee member <b>Eric Horvitz</b> (Microsoft), who has made technical contributions to media provenance efforts, described C2PA&rsquo;s efforts to provide verification that content comes from a trusted source and has not</p> <p class="tx1-1">__________________</p> <section class="footnote" epub:type="rearnotes"> <p epub:type="rearnote" class="fn" id="pz6-10"><sup><a href="#A46I">13</a></sup> <span role="doc-footnote" epub:type="footnote">For information about FTC disclosure requirements, see <a href="https://www.ftc.gov/business-guidance/resources/disclosures-101-social-media-influencers">https://www.ftc.gov/business-guidance/resources/disclosures-101-social-media-influencers</a>.</span></p> </section> </div> </div> <div class="page" data-page="7" data-display="7" data-format="html"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 7 <span class="page-status-text"></span></span> <span class="page-actions"> <span class="page-actions-inner"> <a href="#" class="page-action share addthis_button" title="Share this page" addthis:url="http://www.nap.edu/read/27997/page/7" id="atic"> <span data-icon="&#xe607;"></span> <span class="icon-text">Share</span> </a> <a href="#" class="page-action cite" title="Cite this page"> <span data-icon="&#xe6ad;"></span> <span class="icon-text">Cite</span> </a> </span> </span> </div> <div class="page-tools-panel"> <div class="panel-citation"> <strong>Suggested Citation:</strong>"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief." National Academies of Sciences, Engineering, and Medicine. 2024. 
<em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</em>. Washington, DC: The National Academies Press. doi: 10.17226/27997. <div href="#" class="panel-close">&times;</div> </div> <div class="panel-bookmark"> <form action="" class="bookmark-form"> <input type="hidden" name="id" value=""> <input type="hidden" name="record_id" value="27997"> <input type="hidden" name="page" value="7"> <input type="hidden" name="chapter" value="1"> <div class="input-wrapper"> <input class="bookmark-note" type="text" name="note" placeholder="Add a note to your bookmark" value="" tabindex="-1"> </div> <div class="button-wrapper"> <a class="bookmark-button-save button"><span data-icon="&#xe72f;"></span><span class="icon-text">Save</span></a> </div> <div class="button-wrapper"> <a class="bookmark-button-delete button"><span data-icon="&#xe72e;"></span><span class="icon-text">Cancel</span></a> </div> </form> </div> </div> </div> <div class="clearfix"></div> <div class="page-html"> <p class="tx1">been modified. C2PA is working with about 100 companies and 2000 organizations to establish digital media provenance standards. Horvitz noted that, several weeks ago, large technology companies announced they will embrace a C2PA standard called Content Credentials. The credential provides a graphic icon indicating whether cryptographically signed metadata and other critical information are available to verify the source and provenance of the content.<sup><a role="doc-noteref" epub:type="noteref" href="#pz7-6" id="ARs">14</a></sup></p> <p class="tx"><b>Ming Ming Chiu</b> (Education University of Hong Kong) presented work on behalf of his collaborators, <b>Jeong-Nam Kim</b> and <b>David Ebert</b> of the University of Oklahoma. Chiu said that many approaches to detect disinformation are opaque, untestable, or narrow.
His team created <i>Deceptive Writing Theory</i>, a multilevel theory of disinformation causes and conditions (e.g., political or financial motives), and developed <i>Statistical Discourse Analysis and Multilevel Diffusion Analysis</i>, analytic tools that model how messages spread within and across sub-populations&mdash;by numbers of users, speed, and diffusion patterns. An AI-powered dashboard displays this information, along with a threat level, for each message. This aids government officials tracking dangerous messages, provides students with tools to detect them, and gives researchers from multiple disciplines the opportunity to build upon their work.</p> <p class="tx"><b>Dongwon Lee</b> (Pennsylvania State University) is developing computational tools to flag disinformation at scale. He noted that existing academic tools have limited functionality. Social media companies have in-house solutions to flag disinformation, but do not share their algorithms. Lee suggested that LLMs have the potential to become a democratizing platform for the detection and explanation of disinformation in different settings and languages. His research has found that it is still relatively easy to fool LLMs into generating factually incorrect information at scale, but they can be configured to detect human- and AI-generated disinformation with reasonable accuracy.</p> <h2>REFLECTIONS FROM THE PANEL DISCUSSANT</h2> <p class="tx1"><b>Samuel Woolley</b> (University of Texas at Austin) welcomed the variety of technical tools available to aid producers and receivers of information and to assess content. He applauded Ahmad&rsquo;s ideas to counter disinformation through interventions that have financial implications for the producers of disinformation; Amazeen&rsquo;s suggestions on how to dismantle or disrupt the native advertising ecosystem; and Chiu&rsquo;s ideas for testing theories about how messages are spread (and for presenting the information on a dashboard).
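The kind of spread metrics such dashboards summarize (reach, speed, cross-group diffusion) can be pictured with a toy computation over share events. The data model and measures here are hypothetical simplifications, not Chiu's actual Statistical Discourse Analysis or Multilevel Diffusion Analysis.

```python
# Hypothetical sketch: compute simple diffusion metrics for one message
# from share events tagged with user, sub-population, and timestamp (seconds).
from collections import namedtuple

Share = namedtuple("Share", ["user", "group", "timestamp"])

def spread_metrics(shares):
    """Reach (unique users), sub-populations crossed, and shares per second."""
    users = {s.user for s in shares}
    groups = {s.group for s in shares}
    duration = max(s.timestamp for s in shares) - min(s.timestamp for s in shares)
    # Degenerate case: all shares at one instant; report reach as the rate.
    speed = len(users) / duration if duration else float(len(users))
    return {"reach": len(users), "groups_crossed": len(groups), "shares_per_sec": speed}

events = [Share("a", "g1", 0), Share("b", "g1", 10), Share("c", "g2", 20)]
print(spread_metrics(events))  # {'reach': 3, 'groups_crossed': 2, 'shares_per_sec': 0.15}
```

A real multilevel analysis would model diffusion patterns statistically across nested sub-populations rather than report flat counts.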
He suggested that Lee&rsquo;s effort to flag disinformation at scale using LLMs would be useful, particularly because of their capacity to perform in languages other than English. Impey&rsquo;s machine-learning effort has the capacity to distinguish disinformation across massive amounts of content. Woolley noted a need for resources and emphasized the importance of collaboration across disciplines, companies, and organizations, citing C2PA as an example. &ldquo;No single technological solution to this problem will actually deal with&rdquo; disinformation, so &ldquo;reiterating that to everyone we meet is so important,&rdquo; especially to policy makers and others who are looking for a &ldquo;magic bullet.&rdquo;</p> <h2>DISCUSSION</h2> <p class="tx1">Horvitz said that traditional media also amplify content through, for example, word choices in headlines. Amazeen noted that search engine optimization has had an impact on headlines. In addition, news websites and apps often suggest content based on prior reading history, which can include sponsored content. Ahmad said that academic collaborations with companies that deal with advertisers (such as AdTech platforms, i.e., advertising technology software and tools advertisers use to buy, manage, and analyze digital advertising) would be helpful in identifying the extent to which companies&rsquo; behavior changes as a result of providing greater transparency to consumers.</p> <p class="tx">Woolley said that unpacking connections between advertising, public relations, and political manipulation is important. He noted that funding for longitudinal studies on disinformation is dwindling. &ldquo;There is a large-scale politicization of this work that has the potential to continue to hinder our ability to do this work,&rdquo; he said. &ldquo;People set up a dichotomy between free speech and open content across platforms or privacy and safety. 
That dichotomy must be rejected.&rdquo;</p> <p class="tx1-1">__________________</p> <section class="footnote" epub:type="rearnotes"> <p epub:type="rearnote" class="fn" id="pz7-6"><sup><a href="#ARs">14</a></sup> <span role="doc-footnote" epub:type="footnote">For more information, see <a href="https://c2pa.org/post/contentcredentials/">https://c2pa.org/post/contentcredentials/</a>.</span></p> </section> </div> </div> <div class="page" data-page="8" data-display="8" data-format="html"> <div class="page-html"> <p class="tx1">Planning committee member <b>Aziz Z. Huq</b> (University of Chicago) asked about data on consumer adoption when tools are available. &ldquo;If we think that individuals are not going to take advantage of opportunities to mitigate disinformation at a retail or individualized level, maybe that pushes toward more wholesale solutions,&rdquo; he said. Horvitz said more research is needed on whether consumers will use the Content Credential or other standards that he discussed. Ahmad said that companies normally outsource their advertising operations, and it is important to make it easier for companies to know where their advertising appears and to improve transparency by making that information persistently available to companies and consumers.</p> <p class="tx">Kim emphasized that disinformation occurs worldwide and that technological solutions must be multilingual and multicultural, especially to reach poorer regions. Ebert added that multiple strategies are needed to counter long-term disinformation campaigns, which often use different techniques.
He said that visual representations of information flows exert a powerful effect on researchers and the public. Goldberg highlighted larger, slow, long-term, subtle sets of disinformation narratives, which Starbird and colleagues have referred to as &ldquo;frames.&rdquo; Frames set by native advertising and generative AI, for example, can subtly influence how the world is seen.</p> <p class="tx">Donovan suggested the application of a &ldquo;public works mindset&rdquo; to move the field forward. &ldquo;If the metaphor of an information superhighway is somewhat accurate, where are the signs, where are the guardrails? If people are looking for credible information, particularly about public health, why don&rsquo;t they run into credible information quickly?&rdquo;</p> <p class="tx">Chiu said that forcing social media users to wait a few seconds before forwarding a message can increase their reflexivity and change their forwarding behaviors. Woolley said that it is important to think of who should be held accountable for the impacts of disinformation and who should do the work of solving the problem. Horvitz noted that there are federal and state legislative efforts to require media provenancing, such as the California Provenance Authenticity and Watermarking Standards Act.<sup><a role="doc-noteref" epub:type="noteref" href="#pz8-5" id="A21">15</a></sup></p> <p class="tx">Impey said that authority is questioned in all areas and that technological approaches to counter disinformation cannot be &ldquo;black-boxed.&rdquo; Kim raised the concept of norm-chilling, which posits that, when people perceive they hold a minority opinion, they may remain silent. He suggested, however, that if norm-chilling causes people not to amplify a disinformation message, that could be a positive.</p> <p class="tx">Chiu said that scale-up of interventions differs across political contexts.
In Asia, for example, some ministries of information can impose requirements on companies that the U.S. government cannot (e.g., requiring the use of certain plug-ins on web browsers to flag disinformation). Horvitz called for greater public understanding of the value of media provenance and for funding for academic teams to assist in guiding the technology.</p> <p class="tx">Planning committee member <b>Susan S. Silbey</b> (Massachusetts Institute of Technology) said that, &ldquo;We have been here before.&rdquo; She said that new technologies have always caused disruption to &ldquo;the fundamental patterns of life,&rdquo; even if anonymity contributes to the current problem. Producers of drugs, automobiles, and other products are held responsible for what they make or do through science, law, and other means. Human groups function by holding each other responsible for what they have done, she said, but no one is held responsible for disinformation.</p> <h2>REGULATORY AND OTHER INCENTIVES AND DISINCENTIVES FOR BEHAVIOR CHANGE</h2> <p class="tx1">Huq, moderator of the workshop&rsquo;s final panel session, said that disinformation is shaped by a range of dynamics, suggesting that the educational solutions to disinformation available to consumers and commercial entities turn on regulatory choices made by government.</p> <p class="tx"><b>Joshua Braun</b> (University of Massachusetts Amherst) discussed financial incentives for disinformation. He said that such incentives have curbed progress in moderation and towards transparency, and the space has been difficult to reform. 
Digital advertising, with many players and transitions, is prone, he said, to what Charles Perrow referred to as normal accidents.<sup><a role="doc-noteref" epub:type="noteref" href="#pz8-12" id="Ano">16</a></sup> Rather than strictly</p> <p class="tx1-1">__________________</p> <section class="footnote" epub:type="rearnotes"> <p epub:type="rearnote" class="fn" id="pz8-5"><sup><a href="#A21">15</a></sup> <span role="doc-footnote" epub:type="footnote">See <a href="https://trackbill.com/bill/california-assembly-bill-3211-california-provenance-authenticity-and-watermarking-standards/2520580/">https://trackbill.com/bill/california-assembly-bill-3211-california-provenance-authenticity-and-watermarking-standards/2520580/</a>.</span></p> <p epub:type="rearnote" class="fn" id="pz8-12"><sup><a href="#Ano">16</a></sup> <span role="doc-footnote" epub:type="footnote">See C. Perrow. 2000. <i>Normal Accidents: Living with High-Risk Technologies</i>. Princeton, NJ: Princeton University Press. Perrow recognized two dimensions of risk&mdash;linear versus interactive complexity, and tight versus loose coupling. A &ldquo;normal accident&rdquo; is a situation in which the systems are so complex and interrelated, and opportunities for human intervention are so reduced, that an accident of some sort is almost inevitable. 
Braun sug-</span></p> </section> </div> </div> <div class="page" data-page="9" data-display="9" data-format="html"> <div class="page-html"> <p class="tx1">technological solutions, &ldquo;normal accident&rdquo; literature suggests other solutions, such as problem solving that walks away from systems that are hard to manage, makes it easier for humans to intervene, decreases system complexity, focuses on change management and organizational cultures, and accepts failures as inevitable (but limits them). He said that these five approaches to problem solving might be applied to containing profit-driven disinformation spread through AdTech.</p> <p class="tx"><b>David Bray</b> (Henry Stimson Center) said that, while the increasing democratization of technology can be both good and bad, the tradecraft of information discernment has not been democratized. He looks at how to deter bad actors who use multiple means to spread disinformation against the interests of national defense, law enforcement, and civil norms (his biggest concern). Nonprofits, labs, and other groups must be involved in efforts to counter disinformation&mdash;not just governments or social media platforms.
People need &ldquo;digital dignity&rdquo; wherein their human rights are respected online, but Bray said that people&rsquo;s voice has been taken away through the appropriation of their data. Privacy protections, crowdsourcing, and a system in which professional certified data scientists accept responsibility (in a manner similar to certified public accountants) have a role to play in making technology that benefits customers, citizens, and communities.</p> <p class="tx"><b>Nandini Jammi</b> (Check My Ads and Sleeping Giants) said that the disinformation crisis is fueled by advertising. As an example of digital ad watchdog activity, she described an intervention her organization engaged in when they were contacted about a scam that showed up on Google searches. They singled out the Google Vice President of Global Advertising and asked community members to email him directly about the scam. Google responded immediately by changing pertinent policies and banning the scammer. Jammi and her colleagues are &ldquo;moving up the ladder&rdquo; from individual advertisers to holding accountable the bodies and companies that allow disinformation, scams, and crimes to proliferate.<sup><a role="doc-noteref" epub:type="noteref" href="#pz9-5" id="AA0">17</a></sup></p> <p class="tx"><b>Jeff Kosseff</b> (U.S. Naval Academy) discussed what is and is not possible in the regulation of disinformation.<sup><a role="doc-noteref" epub:type="noteref" href="#And" id="ADx0">18</a></sup> He briefly described the origins of Section 230 of the Communications Decency Act of 1996 and acknowledged the many calls for legislative amendments to the section. Kosseff suggested that, even if Section 230 were to be repealed, the First Amendment would still protect a great deal of false speech, under the premise that, except for a few narrowly defined exceptions, speech should be safeguarded.
He cautioned against the impulse for new legal and regulatory actions, calling instead for government efforts to fund media literacy, support libraries, and foster transparency and trust.</p> <p class="tx"><b>Nathalie Smuha</b> (KU Leuven) said that, although laws do not solve everything (and are subject to abuse), they are a powerful tool in democratic societies. In Europe, different regulatory initiatives have been taken to address the concerns around disinformation, which U.S. institutions could explore for inspiration. She said that the most prominent initiative is the DSA, which has consequences for platforms outside the EU&mdash;for example in instances where their services are available to European consumers.<sup><a role="doc-noteref" epub:type="noteref" href="#pz9-9" id="Acl">19</a></sup> The DSA calls for increased transparency about algorithms, training, and terms and conditions, rather than identifying what types of speech must be taken down. The Act mandates that platforms and search engines that have a systemic impact undertake risk impact assessments. Social harms to users must be documented, acknowledged, and mitigated by social media companies. The Act also establishes a system of independent, impartial, trusted content flaggers with whom platforms must cooperate, puts in place processes for challenging a platform&rsquo;s decision to take down content, and compels the largest platforms to allow researcher access to key data. 
Procedural elements of the DSA make the Act a powerful mechanism for countering disinformation.</p> <p class="tx1-1">__________________</p> <section class="footnote" epub:type="rearnotes"> <p epub:type="rearnote" class="fn1"><span role="doc-footnote" epub:type="footnote">gested that Perrow&rsquo;s Normal Accident Theory could aid in the diagnosis of problems and offer a taxonomy to look for solutions.</span></p> <p epub:type="rearnote" class="fn" id="pz9-5"><sup><a href="#AA0">17</a></sup> <span role="doc-footnote" epub:type="footnote">She offered as examples of organizations involved in accountability the Trustworthy Accountability Group (TAG) (see <a href="https://www.tagtoday.net">https://www.tagtoday.net</a>) and the Media Ratings Council (MRC) (see <a href="https://mediaratingcouncil.org">https://mediaratingcouncil.org</a>). TAG &ldquo;advances its mission of eliminating fraudulent traffic, sharing of threat intelligence, promoting brand safety and enabling transparency by connecting industry leaders, analyzing threats, and sharing best practices worldwide&rdquo; and MRC &ldquo;is a not-for-profit industry self-regulatory body, established in 1963 at the request of U.S. Congress, that audits and accredits media measurement products and data sources across Digital, Out-of-Home, Print, Radio, Television, and cross-media products.&rdquo;</span></p> <p epub:type="rearnote" class="fn" id="And"><sup><a href="#ADx0">18</a></sup> <span role="doc-footnote" epub:type="footnote">Kosseff wrote a book on Section 230 (see J. Kosseff. 2019. <i>The Twenty-Six Words that Created the Internet</i>. Ithaca, NY: Cornell University Press.) 
and spoke at the National Academies&rsquo; 2021 workshop <i>Section 230 Protections: Can Legal Revisions or Novel Technologies Limit Online Misinformation and Abuse?</i> (see <a href="/read/27997/chapter/1#pz2-5">footnote 4</a>).</span></p> <p epub:type="rearnote" class="fn" id="pz9-9"><sup><a href="#Acl">19</a></sup> <span role="doc-footnote" epub:type="footnote">For more information on the DSA, see <a href="https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en">https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en</a>.</span></p> </section> </div> </div> <div class="page" data-page="10" data-display="10" data-format="html"> <div class="page-html"> <h2>REFLECTIONS FROM THE PANEL DISCUSSANT</h2> <p class="tx1"><b>Brandie Nonnecke</b> (University of California, Berkeley) reflected on scale and complexity, asking whether the complexity of the internet is a necessary evil or whether digital ad markets and other features can be simplified. Taking a long view, she said that misinformation and disinformation have always occurred, because &ldquo;those in the past who had the means to record history also had the means to write it in their favor.&rdquo; As modes of inquiry change, so do actions, and it is important to treat causes, not just symptoms. Nonnecke said that a lack of trust in institutions allows actors to spread mis- and disinformation.
She said that, while Section 230 and the DSA can give users greater power to choose content, modes of inquiry are needed to study what might happen, for example, if users consume only content that appeals to their world view.</p> <h2>DISCUSSION</h2> <p class="tx1">Huq suggested that a person listening to the panelists would think that law and regulation do not have a significant role to play in addressing disinformation, but that there are many similar instances of &ldquo;wicked problems&rdquo; wherein it would be extraordinary to say the state has no role.</p> <p class="tx">Bray said that, for free societies, it should not be solely up to the nation-state to solve the problem of disinformation, as this risks being perceived as censorship. He said that the line between political rhetoric, advertising, and disinformation is blurry, and that regulatory processes can be manipulated by technology. As an example, he drew upon his experience at the Federal Communications Commission when calls for public comments in 2014 and 2017 were inundated with huge numbers of bot-generated comments. Bray said that AI-generated content will make it even harder to distinguish between human and bot interactions. He said that he does not want too much power to remain with one sector, be it government or industry. Instead, he said that collectives, such as nonprofits, universities, or other groups of people, must be engaged. Bray suggested that an ideal law to mitigate disinformation would ensure that individuals have digital dignity and the right to know how their data is used.</p> <p class="tx">Kosseff differentiated between legal and regulatory responses. Free speech protections are critical to democracy, and he warned against narrowing free speech in any circumstance.
Regardless of one&rsquo;s political views, it is not a good idea for judges to order takedowns of politicians&rsquo; postings, he said.</p> <p class="tx">The digital advertising space is set up to maximize consumer engagement, and the focus is on the individual at the expense of context, Braun said. When social media advertising does not deliver successful results, the industry has responded by seeking to collect still more data, a form of commercial surveillance.<sup><a role="doc-noteref" epub:type="noteref" href="#pz10-10" id="AL0">20</a></sup> Braun suggested that Congress could help rebalance the scales in favor of consumers by limiting the data that companies can collect or the uses to which it can be put. Huq mentioned the Supreme Court case <i>Sorrell v. IMS Health</i>, which ruled that collection and dissemination of data is speech covered by the First Amendment.<sup><a role="doc-noteref" epub:type="noteref" href="#pz10-11" id="ASUl">21</a></sup> &ldquo;Were <i>Sorrell</i> to be pushed to its logical extreme, many forms of data regulation that are implicitly in the proposals that Kosseff and Bray identified would be off the table,&rdquo; he said.</p> <p class="tx">Jammi said her organization operates in the free market space by pitting market forces against each other. The initial focus has been on consumers versus advertisers. Constantly leveraging these forces against each other helps consumers have their voices heard at the highest level, she said. She agreed that consumers do not understand how their data are used and said that the next step for regulation is to force transparency. She said that a national registry of data brokers is needed to shed light on their identities and the data they collect.</p> <p class="tx">Smuha agreed that a combination of technical, organizational, and legal solutions is needed, but emphasized that, as members of a democratic society, we should come together in public deliberation to establish rules for governance.
This, she said, is the spirit behind European legislation. Smuha said that, while the DSA is imperfect, it enables a wide range of stakeholders to play a role in safeguarding the online space. Adams called attention to the DSA&rsquo;s provisions on data sharing and on the legal concept of social harm. If social harms can be demonstrated, provisions kick in that fund experimental research on interventions to mitigate the harms.</p> <p class="tx1-1">__________________</p> <section class="footnote" epub:type="rearnotes"> <p epub:type="rearnote" class="fn" id="pz10-10"><sup><a href="#AL0">20</a></sup> <span role="doc-footnote" epub:type="footnote">Commercial surveillance is the business of collecting, analyzing, and profiting from information about people. See, e.g., <a href="https://www.ftc.gov/news-events/news/press-releases/2022/08/ftc-explores-rules-cracking-down-commercial-surveillance-lax-data-security-practices">https://www.ftc.gov/news-events/news/press-releases/2022/08/ftc-explores-rules-cracking-down-commercial-surveillance-lax-data-security-practices</a>.</span></p> <p epub:type="rearnote" class="fn" id="pz10-11"><sup><a href="#ASUl">21</a></sup> <span role="doc-footnote" epub:type="footnote">This case is summarized at <a href="https://www.oyez.org/cases/2010/10-779">https://www.oyez.org/cases/2010/10-779</a>.</span></p> </section> </div> </div> <div class="page" data-page="11" data-display="11" data-format="html"> <div class="page-html"> <p class="tx1">Kosseff said that no single solution, including media literacy or more government transparency, will fix the problem of misinformation.
He said that commercial speech (including scams targeted at consumers) can be regulated by the FTC, but urged caution in enacting legislation or regulations that remove protections for free speech because &ldquo;threats to democracy are so much greater because, once you get rid of those protections, you&rsquo;re not going back.&rdquo;</p> <p class="tx">Bray agreed with Kosseff&rsquo;s caution about interventions and warned against those that could lead to autocracies of thought and reality. He expressed hope that, by learning how people fall for scams, we will collect data and metrics that can be useful when considering ways to enhance the ability of individuals and communities to better discern authentic vs. inauthentic information.</p> <p class="tx">Huq called attention to questions submitted by the audience about the feasibility of a licensing system for generative AI and the right of free speech versus the right to amplify speech, which he referred to as &ldquo;speech at the retail level and speech at scale.&rdquo; Bray suggested that there are lessons to be learned from the experience with amateur radio licenses in the 1920s. At the time, radio was seen as having both promise and peril. Licensing did not prohibit speaking at scale, but licensees had to go through a process and identify themselves to obtain a license. Bray warned about restrictions on amplifying speech, which he suggested could lead to abuses such as governments not allowing private citizens to broadcast their views. Instead, he suggested licensing, certification, or a requirement that data brokers and others take an ethical oath akin to the Hippocratic Oath. He said that Australia is experimenting with citizen juries where representative members of the public can raise concerns or objections about a range of topics that can include online content.
Kosseff suggested that regulating amplification would have the same impact as regulating speech because platforms exist to amplify or de-amplify speech.</p> <h2>LOOKING FORWARD: REFLECTIONS FROM THE PLANNING COMMITTEE</h2> <p class="tx1">At the conclusion of the workshop, six members of the planning committee shared what they had taken away from the discussions.</p> <p class="tx">Donovan reflected on the need to understand the DSA and called for a transition to durable institutions and infrastructure that can create a healthy environment where people have the right to truth. She challenged social media companies, search engines, and other information conduits to put more work into ontologies and ranking systems. The field&rsquo;s increasing coordination is heartening, she said, but she shared frustration that the code of TALK (Timely Accurate Local Knowledge) has not been cracked, as this is the kind of information that people need to make informed decisions. While there is promise in technology and citizen participation, some profit from disinformation. Donovan said that we need to protect biometric privacy as AI technology advances and said that we all hold the keys to unlocking a future different from the present.</p> <p class="tx">Horvitz said that traditional media must become involved in countering disinformation and collaborate with other stakeholders to build media provenance technologies. Interventions that show promise should be pursued, even if they seem obvious or like small steps. Research is needed to develop best practices, pool results, and think through the players and disciplines that should be involved (e.g., social psychology, cognitive psychology, technology). It is important to understand the implications of technologies to prevent them from backfiring and causing harm, including in the Global South.
For Horvitz, a big question is, &ldquo;How do we avoid entering a post-epistemic world that our grandchildren will be living in?&rdquo; He expressed concern about the potential of adversarial generative AI systems that can understand the world, take goals from autocratic rulers, and generate believable stories that combine disinformation and real-world events.</p> <p class="tx">Huq noted that many words and concepts&mdash;such as free speech, market, and state&mdash;can refer to a range of possibilities. He reflected on the implications of moving away from a dichotomy between markets and states toward a choice between different social equilibria: that</p> </div> </div> <div class="page" data-page="12" data-display="12" data-format="html"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 12 <span class="page-status-text"></span></span> </div> </div> <div class="clearfix"></div> <div class="page-html"> <p class="tx1">is, between a healthy public sphere and one in which bad speech drives out good and where it is hard to rely on the machinery of government to produce laws to protect speech and democratic values. While it is impossible to imagine either an entirely civic or entirely regulatory approach moving towards good, he is now more skeptical of relying on individuals at the retail level to equip themselves with the right tools to counter disinformation. Disinformation is a problem at scale that democratic governments must address.</p> <p class="tx">Silbey urged more attention to history. Creators, innovators, and disruptors often have minimal notions of how the social ecosystem works, she said. People are not just producers or consumers; they are participants. Data transparency, access, and privacy are important to consider when thinking about how to protect individuals, but it is equally important to create the capacity for accountability for failures. The market can do this, she said, but law, professional norms, and standardization are also critical.
There are actions at multiple levels of scale&mdash;individuals, masses, organizations, systems&mdash;but capacities at each of these scales are unequal. This has resulted in a concentration of ownership and a proliferation of bad actors. Silbey said that it is important to understand how people buy into disinformation, but it is also important to remember that many &ldquo;unknown unknowns&rdquo; exist.</p> <p class="tx">Starbird noted that sponsors of disinformation research have shifted priorities in recent years. No one direction is right or wrong, she said, but online disinformation requires a multi-pronged approach&mdash;and education is an important dimension.</p> <p class="tx">Perlmutter said that he was encouraged to hear about ideas that can work to counter disinformation despite the many challenges identified during the workshop. He expressed interest in research, experimentation, and monitoring. We have encountered challenging periods before, he said, but multiple approaches can lead to progress. 
He suggested that it is important to develop techniques for citizen oversight so that we are not at the mercy of trusting that industry or government will effectively address disinformation.</p> </div> </div> <div class="page" data-page="13" data-display="13" data-format="html"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 13 <span class="page-status-text"></span></span> </div> </div> <div class="clearfix"></div> <div class="page-html"> <p class="tx1"><b>DISCLAIMER</b> This Proceedings of a Workshop&mdash;in Brief has been prepared by <b>Paula Whitacre, Steven Kendall</b>, and <b>Anne-Marie Mazza</b> as a factual summary of what occurred at the meeting. The committee&rsquo;s role was limited to planning the event. The statements made are those of the individual workshop participants and do not necessarily represent the views of all participants, the project sponsors, the planning committee, the Committee on Science, Technology, and Law, or the National Academies.</p> <p class="tx1-1"><b>REVIEWERS</b> To ensure that it meets institutional standards for quality and objectivity, this Proceedings of a Workshop&mdash;in Brief was reviewed by <b>Jon Bateman</b>, Carnegie Endowment for International Peace; <b>Joshua Braun</b>, University of Massachusetts Amherst; and <b>Steven Brill</b>, NewsGuard.
<b>Marilyn Baker</b>, National Academies of Sciences, Engineering, and Medicine, served as the review coordinator.</p> <p class="tx1-1"><b>PLANNING COMMITTEE Joan Donovan</b> (<i>Co-chair</i>), Boston University; <b>Saul Perlmutter</b> (<i>Co-chair</i>), University of California, Berkeley, and Lawrence Berkeley National Laboratory; <b>Amelia Acker</b>, University of Texas at Austin; <b>Hany Farid</b>, University of California, Berkeley; <b>Beth Mara Goldberg</b>, Jigsaw (Google); <b>Mark Hansen</b>, Columbia University; <b>Eric Horvitz</b>, Microsoft; <b>Aziz Z. Huq</b>, University of Chicago; <b>Susan S. Silbey</b>, Massachusetts Institute of Technology; <b>Kate Starbird</b>, University of Washington.</p> <p class="tx1-1"><b>NATIONAL ACADEMIES OF SCIENCES, ENGINEERING, AND MEDICINE STAFF Steven Kendall</b>, Senior Program Officer; <b>Anne-Marie Mazza</b>, Senior Director; <b>Renee Daly</b>, Senior Program Assistant.</p> <p class="tx1-1"><b>SPONSORS</b> This project was funded by the Gordon and Betty Moore Foundation and the Rockefeller Foundation.</p> <p class="tx-2"><b>SUGGESTED CITATION</b> National Academies of Sciences, Engineering, and Medicine. 2024. <i>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop&mdash;in Brief</i>. Washington, DC: National Academies Press. <a href="https://doi.org/10.17226/27997">https://doi.org/10.17226/27997</a>.</p> <table style="width: 100%;"> <colgroup> <col style="width: 70%;"> <col style="width: 30%;"> </colgroup> <tbody> <tr class="col4"> <td class="tdt"> <p class="tx1-2"><b>Policy and Global Affairs</b></p> <p class="tx1-2"><i>Copyright 2024 by the National Academy of Sciences. All rights reserved.</i></p> </td> <td class="tdt"><img alt="NATIONAL ACADEMIES Sciences Engineering Medicine The National Academies provide independent, trustworthy advice that advances solutions to society&rsquo;s most complex challenges."
src="/openbook/27997/xhtml/images/img-13-1.jpg" width="303" height="206"></td> </tr> </tbody> </table> </div> </div> </div> <div id="openbook-image" class="hide"> <div class="page" data-page="1" data-display="1" data-format="image"> <div class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 1 <span class="page-status-text"></span></span> <span class="page-actions"> <span class="page-actions-inner"> <a href="#" class="page-action share addthis_button" title="Share this page" addthis:url="http://www.nap.edu/read/27997/page/1" id="atic"> <span data-icon="&#xe607;"></span> <span class="icon-text">Share</span> </a> <a href="#" class="page-action cite" title="Cite this page"> <span data-icon="&#xe6ad;"></span> <span class="icon-text">Cite</span> </a> </span> </span> </div> <div class="page-tools-panel"> <div class="panel-citation"> <strong>Suggested Citation:</strong>"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief." National Academies of Sciences, Engineering, and Medicine. 2024. <em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</em>. Washington, DC: The National Academies Press. doi: 10.17226/27997. 
<div href="#" class="panel-close">&times;</div> </div> <div class="panel-bookmark"> <form action="" class="bookmark-form"> <input type="hidden" name="id" value=""> <input type="hidden" name="record_id" value="27997"> <input type="hidden" name="page" value="1"> <input type="hidden" name="chapter" value="1"> <div class="input-wrapper"> <input class="bookmark-note" type="text" name="note" placeholder="Add a note to your bookmark" value="" tabindex="-1"> </div> <div class="button-wrapper"> <a class="bookmark-button-save button"><span data-icon="&#xe72f;"></span><span class="icon-text">Save</span></a> </div> <div class="button-wrapper"> <a class="bookmark-button-delete button"><span data-icon="&#xe72e;"></span><span class="icon-text">Cancel</span></a> </div> </form> </div> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/1.gif" alt="Page 1" width="1200" height="1553"> </div> </div> <div class="page" data-page="2" data-display="2" data-format="image"> <div class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 2 <span class="page-status-text"></span></span> <span class="page-actions"> <span class="page-actions-inner"> <a href="#" class="page-action share addthis_button" title="Share this page" addthis:url="http://www.nap.edu/read/27997/page/2" id="atic"> <span data-icon="&#xe607;"></span> <span class="icon-text">Share</span> </a> <a href="#" class="page-action cite" title="Cite this page"> <span data-icon="&#xe6ad;"></span> <span class="icon-text">Cite</span> </a> </span> </span> </div> <div class="page-tools-panel"> <div class="panel-citation"> <strong>Suggested Citation:</strong>"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in 
Brief." National Academies of Sciences, Engineering, and Medicine. 2024. <em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</em>. Washington, DC: The National Academies Press. doi: 10.17226/27997. <div href="#" class="panel-close">&times;</div> </div> <div class="panel-bookmark"> <form action="" class="bookmark-form"> <input type="hidden" name="id" value=""> <input type="hidden" name="record_id" value="27997"> <input type="hidden" name="page" value="2"> <input type="hidden" name="chapter" value="1"> <div class="input-wrapper"> <input class="bookmark-note" type="text" name="note" placeholder="Add a note to your bookmark" value="" tabindex="-1"> </div> <div class="button-wrapper"> <a class="bookmark-button-save button"><span data-icon="&#xe72f;"></span><span class="icon-text">Save</span></a> </div> <div class="button-wrapper"> <a class="bookmark-button-delete button"><span data-icon="&#xe72e;"></span><span class="icon-text">Cancel</span></a> </div> </form> </div> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/2.gif" alt="Page 2" width="1200" height="1553"> </div> </div> <div class="page" data-page="3" data-display="3" data-format="image"> <div class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 3 <span class="page-status-text"></span></span> <span class="page-actions"> <span class="page-actions-inner"> <a href="#" class="page-action share addthis_button" title="Share this page" addthis:url="http://www.nap.edu/read/27997/page/3" id="atic"> <span data-icon="&#xe607;"></span> <span class="icon-text">Share</span> </a> <a href="#" class="page-action cite" title="Cite this page"> <span data-icon="&#xe6ad;"></span> <span 
class="icon-text">Cite</span> </a> </span> </span> </div> <div class="page-tools-panel"> <div class="panel-citation"> <strong>Suggested Citation:</strong>"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief." National Academies of Sciences, Engineering, and Medicine. 2024. <em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</em>. Washington, DC: The National Academies Press. doi: 10.17226/27997. <div href="#" class="panel-close">&times;</div> </div> <div class="panel-bookmark"> <form action="" class="bookmark-form"> <input type="hidden" name="id" value=""> <input type="hidden" name="record_id" value="27997"> <input type="hidden" name="page" value="3"> <input type="hidden" name="chapter" value="1"> <div class="input-wrapper"> <input class="bookmark-note" type="text" name="note" placeholder="Add a note to your bookmark" value="" tabindex="-1"> </div> <div class="button-wrapper"> <a class="bookmark-button-save button"><span data-icon="&#xe72f;"></span><span class="icon-text">Save</span></a> </div> <div class="button-wrapper"> <a class="bookmark-button-delete button"><span data-icon="&#xe72e;"></span><span class="icon-text">Cancel</span></a> </div> </form> </div> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/3.gif" alt="Page 3" width="1200" height="1553"> </div> </div> <div class="page" data-page="4" data-display="4" data-format="image"> <div class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 4 <span class="page-status-text"></span></span> <span class="page-actions"> <span class="page-actions-inner"> <a href="#" class="page-action share addthis_button" 
title="Share this page" addthis:url="http://www.nap.edu/read/27997/page/4" id="atic"> <span data-icon="&#xe607;"></span> <span class="icon-text">Share</span> </a> <a href="#" class="page-action cite" title="Cite this page"> <span data-icon="&#xe6ad;"></span> <span class="icon-text">Cite</span> </a> </span> </span> </div> <div class="page-tools-panel"> <div class="panel-citation"> <strong>Suggested Citation:</strong>"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief." National Academies of Sciences, Engineering, and Medicine. 2024. <em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</em>. Washington, DC: The National Academies Press. doi: 10.17226/27997. <div href="#" class="panel-close">&times;</div> </div> <div class="panel-bookmark"> <form action="" class="bookmark-form"> <input type="hidden" name="id" value=""> <input type="hidden" name="record_id" value="27997"> <input type="hidden" name="page" value="4"> <input type="hidden" name="chapter" value="1"> <div class="input-wrapper"> <input class="bookmark-note" type="text" name="note" placeholder="Add a note to your bookmark" value="" tabindex="-1"> </div> <div class="button-wrapper"> <a class="bookmark-button-save button"><span data-icon="&#xe72f;"></span><span class="icon-text">Save</span></a> </div> <div class="button-wrapper"> <a class="bookmark-button-delete button"><span data-icon="&#xe72e;"></span><span class="icon-text">Cancel</span></a> </div> </form> </div> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/4.gif" alt="Page 4" width="1200" height="1553"> </div> </div> <div class="page" data-page="5" data-display="5" data-format="image"> <div 
class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 5 <span class="page-status-text"></span></span> <span class="page-actions"> <span class="page-actions-inner"> <a href="#" class="page-action share addthis_button" title="Share this page" addthis:url="http://www.nap.edu/read/27997/page/5" id="atic"> <span data-icon="&#xe607;"></span> <span class="icon-text">Share</span> </a> <a href="#" class="page-action cite" title="Cite this page"> <span data-icon="&#xe6ad;"></span> <span class="icon-text">Cite</span> </a> </span> </span> </div> <div class="page-tools-panel"> <div class="panel-citation"> <strong>Suggested Citation:</strong>"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief." National Academies of Sciences, Engineering, and Medicine. 2024. <em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</em>. Washington, DC: The National Academies Press. doi: 10.17226/27997. 
<div href="#" class="panel-close">&times;</div> </div> <div class="panel-bookmark"> <form action="" class="bookmark-form"> <input type="hidden" name="id" value=""> <input type="hidden" name="record_id" value="27997"> <input type="hidden" name="page" value="5"> <input type="hidden" name="chapter" value="1"> <div class="input-wrapper"> <input class="bookmark-note" type="text" name="note" placeholder="Add a note to your bookmark" value="" tabindex="-1"> </div> <div class="button-wrapper"> <a class="bookmark-button-save button"><span data-icon="&#xe72f;"></span><span class="icon-text">Save</span></a> </div> <div class="button-wrapper"> <a class="bookmark-button-delete button"><span data-icon="&#xe72e;"></span><span class="icon-text">Cancel</span></a> </div> </form> </div> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/5.gif" alt="Page 5" width="1200" height="1553"> </div> </div> <div class="page" data-page="6" data-display="6" data-format="image"> <div class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 6 <span class="page-status-text"></span></span> <span class="page-actions"> <span class="page-actions-inner"> <a href="#" class="page-action share addthis_button" title="Share this page" addthis:url="http://www.nap.edu/read/27997/page/6" id="atic"> <span data-icon="&#xe607;"></span> <span class="icon-text">Share</span> </a> <a href="#" class="page-action cite" title="Cite this page"> <span data-icon="&#xe6ad;"></span> <span class="icon-text">Cite</span> </a> </span> </span> </div> <div class="page-tools-panel"> <div class="panel-citation"> <strong>Suggested Citation:</strong>"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in 
Brief." National Academies of Sciences, Engineering, and Medicine. 2024. <em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</em>. Washington, DC: The National Academies Press. doi: 10.17226/27997. <div href="#" class="panel-close">&times;</div> </div> <div class="panel-bookmark"> <form action="" class="bookmark-form"> <input type="hidden" name="id" value=""> <input type="hidden" name="record_id" value="27997"> <input type="hidden" name="page" value="6"> <input type="hidden" name="chapter" value="1"> <div class="input-wrapper"> <input class="bookmark-note" type="text" name="note" placeholder="Add a note to your bookmark" value="" tabindex="-1"> </div> <div class="button-wrapper"> <a class="bookmark-button-save button"><span data-icon="&#xe72f;"></span><span class="icon-text">Save</span></a> </div> <div class="button-wrapper"> <a class="bookmark-button-delete button"><span data-icon="&#xe72e;"></span><span class="icon-text">Cancel</span></a> </div> </form> </div> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/6.gif" alt="Page 6" width="1200" height="1553"> </div> </div> <div class="page" data-page="7" data-display="7" data-format="image"> <div class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 7 <span class="page-status-text"></span></span> <span class="page-actions"> <span class="page-actions-inner"> <a href="#" class="page-action share addthis_button" title="Share this page" addthis:url="http://www.nap.edu/read/27997/page/7" id="atic"> <span data-icon="&#xe607;"></span> <span class="icon-text">Share</span> </a> <a href="#" class="page-action cite" title="Cite this page"> <span data-icon="&#xe6ad;"></span> <span 
class="icon-text">Cite</span> </a> </span> </span> </div> <div class="page-tools-panel"> <div class="panel-citation"> <strong>Suggested Citation:</strong>"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief." National Academies of Sciences, Engineering, and Medicine. 2024. <em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</em>. Washington, DC: The National Academies Press. doi: 10.17226/27997. <div href="#" class="panel-close">&times;</div> </div> <div class="panel-bookmark"> <form action="" class="bookmark-form"> <input type="hidden" name="id" value=""> <input type="hidden" name="record_id" value="27997"> <input type="hidden" name="page" value="7"> <input type="hidden" name="chapter" value="1"> <div class="input-wrapper"> <input class="bookmark-note" type="text" name="note" placeholder="Add a note to your bookmark" value="" tabindex="-1"> </div> <div class="button-wrapper"> <a class="bookmark-button-save button"><span data-icon="&#xe72f;"></span><span class="icon-text">Save</span></a> </div> <div class="button-wrapper"> <a class="bookmark-button-delete button"><span data-icon="&#xe72e;"></span><span class="icon-text">Cancel</span></a> </div> </form> </div> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/7.gif" alt="Page 7" width="1200" height="1553"> </div> </div> <div class="page" data-page="8" data-display="8" data-format="image"> <div class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 8 <span class="page-status-text"></span></span> <span class="page-actions"> <span class="page-actions-inner"> <a href="#" class="page-action share addthis_button" 
title="Share this page" addthis:url="http://www.nap.edu/read/27997/page/8" id="atic"> <span data-icon="&#xe607;"></span> <span class="icon-text">Share</span> </a> <a href="#" class="page-action cite" title="Cite this page"> <span data-icon="&#xe6ad;"></span> <span class="icon-text">Cite</span> </a> </span> </span> </div> <div class="page-tools-panel"> <div class="panel-citation"> <strong>Suggested Citation:</strong>"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief." National Academies of Sciences, Engineering, and Medicine. 2024. <em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</em>. Washington, DC: The National Academies Press. doi: 10.17226/27997. <div href="#" class="panel-close">&times;</div> </div> <div class="panel-bookmark"> <form action="" class="bookmark-form"> <input type="hidden" name="id" value=""> <input type="hidden" name="record_id" value="27997"> <input type="hidden" name="page" value="8"> <input type="hidden" name="chapter" value="1"> <div class="input-wrapper"> <input class="bookmark-note" type="text" name="note" placeholder="Add a note to your bookmark" value="" tabindex="-1"> </div> <div class="button-wrapper"> <a class="bookmark-button-save button"><span data-icon="&#xe72f;"></span><span class="icon-text">Save</span></a> </div> <div class="button-wrapper"> <a class="bookmark-button-delete button"><span data-icon="&#xe72e;"></span><span class="icon-text">Cancel</span></a> </div> </form> </div> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/8.gif" alt="Page 8" width="1200" height="1553"> </div> </div> <div class="page" data-page="9" data-display="9" data-format="image"> <div 
class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 9 <span class="page-status-text"></span></span> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/9.gif" alt="Page 9" width="1200" height="1553"> </div> </div> <div class="page" data-page="10" data-display="10" data-format="image"> <div class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 10 <span class="page-status-text"></span></span> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/10.gif" alt="Page 10" width="1200" height="1553"> </div> </div> <div class="page" data-page="11" data-display="11" data-format="image"> <div class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 11 <span class="page-status-text"></span></span> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/11.gif" alt="Page 11" width="1200" height="1553"> </div> </div> <div class="page" data-page="12" data-display="12" data-format="image"> <div class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 12 <span class="page-status-text"></span></span> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/12.gif" alt="Page 12" width="1200" height="1553"> </div> </div> <div class="page" data-page="13" data-display="13" data-format="image"> <div class="page-image"> <div class="page-tools"> <div class="page-tools-inner"> <span class="page-number">Page 13 <span class="page-status-text"></span></span> </div> </div> <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/books/27997/gif/13.gif" alt="Page 13" width="1200" height="1553"> </div> </div> </div> </div> <footer> <a href="http://www.nas.edu" title="The National Academies of Sciences, Engineering, and Medicine" target="_blank"> <img src="/images/hdr/logo-nasem-wht-lg.png" border="0" alt="The National Academies of Sciences, Engineering, and Medicine" style="max-width: 20%; padding: 10px;"></a> <address> <span class="nasem-name">The National Academies of Sciences, Engineering, and Medicine <br></span> 500 Fifth Street, NW | Washington, DC 20001 </address> <div class="copyright-info"><br> <a href="https://www.nationalacademies.org/legal" target=_blank style="color: white;">Copyright &copy; 2025 National Academy of Sciences.
All rights reserved.</a> <a class="legal" href="https://www.nationalacademies.org/legal/terms" style="color: white;" target=_blank>Terms of Use and Privacy Statement</a> </div> </footer> </div> <div id="info-overlay" class="info-overlay"> <div class="info-heading"> <span class="info-heading-title"> Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief </span> <span class="info-heading-static"> Get This Book </span> <div class="info-close">&times;</div> </div> <div class="info-body"> <div class="info-cover-image"> <a href="https://nap.nationalacademies.org/catalog/27997/evolving-technological-legal-and-social-solutions-to-counter-online-disinformation"><img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC" data-original="https://nap.nationalacademies.org/cover/27997/450" width="450" alt=" Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief"></a> </div> <div class="info-buttons"> <div class="mynap-info"> MyNAP members save 10% online. <br> <a href="https://www.nap.edu/login.php?page=https%3A%2F%2Fnap.nationalacademies.org%2Fread%2F27997%2Fchapter%2F1">Login</a> or <a href="https://www.nap.edu/login.php?action=new&page=https%3A%2F%2Fnap.nationalacademies.org%2Fread%2F27997%2Fchapter%2F1">Register</a> to save! </div> <a href="https://nap.nationalacademies.org/download/27997" class="button download-button">Download Free PDF</a> </div> <div class="info-description"> <p>The online information environment enables the global exchange of information and ideas, but it also contributes to the proliferation of disinformation. 
Online platforms operate at a scale where human-based content moderation to counter disinformation is impractical or at least very expensive and where purely technical solutions are challenging because content is often context-dependent. The speed, scale, and complexity of this ecosystem suggest that solutions are needed that consider the global nature of disinformation and effectively blend technical and human capabilities.</p> <p>On April 10 and 11, 2024, an ad hoc committee under the auspices of the National Academies of Sciences, Engineering, and Medicine's Committee on Science, Technology, and Law convened a virtual workshop to consider practicable solutions to counter online disinformation, particularly in social media. This publication summarizes the presentations and discussions of the workshop.</p> <p class="info-overlay-btn info-close">READ FREE ONLINE</p> </div> </div> </div> <nav id="toc-menu" class="toc-menu"> <div class="toc-inner"> <h2>Contents</h2> <a href="#" class="toc-close"> <span data-icon="&#xe72b;"></span> </a> <ul> <li class="toc-search"> <form class="search" action="/booksearch.php"> <input type="hidden" name="record_id" value="27997"> <input type="text" name="term" placeholder="Search this book..."
tabindex="-1" value=""> <span class="search-icon" data-icon="&#xe6b6;"></span> </form> </li> <li class="active"> <span class="chapter-name"> <a href="/read/27997/chapter/1">Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop—in Brief</a> </span> <span class="chapter-page-range">1–13</span> <span class="chapter-skim"> <a href="/nap-cgi/skimchap.cgi?recid=27997&chap=1–13"> <span data-icon="&#xe6f9;"></span> </a> </span> </li> </ul> </div> </nav> <div id="email-panel" class="email-panel"> <form action="" class="email-form"> <h3 class="email-heading">Get Email Updates</h3> <div class="email-close">&times;</div> <p class="email-text">Do you enjoy reading reports from the Academies online <strong>for free</strong>?
Sign up for email notifications and we'll let you know about new publications in your areas of interest when they're released.</p> <input type="hidden" name="group[6973][32768]" value="1"> <input type="hidden" name="SOURCE" value="OpenBook"> <input type="email" name="EMAIL" class="email-address" placeholder="my.email@example.com"> <button href="#" class="button email-signup-button"> <span data-icon="&#xe678;"></span>&nbsp;&nbsp; Send me updates!</button> </form> </div> <script> var openbook = { book: {"title":"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation","subtitle":"Proceedings of a Workshop\u2014in Brief","author":"Paula Whitacre, Steven Kendall, and Anne-Marie Mazza, Rapporteurs; Committee on Science, Technology, and Law; Policy and Global Affairs; National Academies of Sciences, Engineering, and Medicine","page_count":"13","approx":"off","alt_title":"","flat_isbn":"0309727901","forthcoming":"off","no_show":"off","pdf_avail":"off","prepub_pdf":"","for_sale":"off","footnote":"","master_pub":"","size":"8.5 x 11","copyright":"2024","delivery_date":"2024-10-01","embargo_date":"2024-10-01 11:00:00","bulk_pricing":"off","press_release":"off","warning":"","jhp_press_release":"off","has_cd":"off","display_openbook":"1","urls":"","track":"Z","redirect_url":"","publisher":"National Academies Press","report_type":{"code":"workshop_in_brief","word":"Proceedings","sequence":"30","description":"Proceedings published by the National Academies of Sciences, Engineering, and Medicine chronicle the presentations and discussions at a workshop, symposium, or other event convened by the National Academies. 
The statements and opinions contained in proceedings are those of the participants and are not endorsed by other participants, the planning committee, or the National Academies."},"report_type_text":"","view":"default","note":"","free_resources":1,"display_date":"2024-10-01","id":"27997","doi":"10.17226\/27997","url":{"catalog":"https:\/\/nap.nationalacademies.org\/catalog\/27997\/evolving-technological-legal-and-social-solutions-to-counter-online-disinformation","shortest":"https:\/\/nap.nationalacademies.org\/27997","short":"https:\/\/nap.nationalacademies.org\/catalog\/27997","login":"https:\/\/nap.nationalacademies.org\/login.php?record_id=27997","related":"https:\/\/nap.nationalacademies.org\/related.php?record_id=27997","cover":{"tiny":"https:\/\/nap.nationalacademies.org\/cover\/27997\/70","70px":"https:\/\/nap.nationalacademies.org\/cover\/27997\/70","mini":"https:\/\/nap.nationalacademies.org\/cover\/27997\/100","100px":"https:\/\/nap.nationalacademies.org\/cover\/27997\/100","200px":"https:\/\/nap.nationalacademies.org\/cover\/27997\/200","450px":"https:\/\/nap.nationalacademies.org\/cover\/27997\/450","150dpi":"https:\/\/nap.nationalacademies.org\/cover\/27997\/150dpi","300dpi":"https:\/\/nap.nationalacademies.org\/cover\/27997\/300dpi"},"openbook":"https:\/\/nap.nationalacademies.org\/read\/27997","search_inside":"https:\/\/nap.nationalacademies.org\/booksearch.php?record_id=27997","download":"https:\/\/nap.nationalacademies.org\/download\/27997","buy":null},"fulltitle":"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop\u2014in Brief","downloads":531,"topics":[{"topic_name":"Research and Data","id":"471","parent_id":"423","parent_name":"Policy for Science and Technology"}],"display_embargo":"October 1, 2024, at 11:00 am","description":"<p>The online information environment enables the global exchange of information and ideas, but it also contributes to the proliferation of disinformation. 
Online platforms operate at a scale where human-based content moderation to counter disinformation is impractical or at least very expensive and where purely technical solutions are challenging because content is often context-dependent. The speed, scale, and complexity of this ecosystem suggests that solutions are needed that consider the global nature of disinformation and effectively blend technical and human capabilities.<\/p>\n<p>On April 10 and 11, 2024, an ad hoc committee under the auspices of the National Academies of Sciences, Engineering, and Medicine's Committee on Science, Technology, and Law convened a virtual workshop to consider practicable solutions to counter online disinformation, particularly in social media. This publication summarizes the presentation and discussion of the workshop.<\/p>","bindings":{"pdf":{"pdf_book":{"isbn":{"0":"0-309-72790-1"},"isbn13":{"0":"978-0-309-72790-7"},"price":{"0":"0"},"binding":{"0":"pdfb"},"for_sale":{"0":"on"},"free":{"0":"on"},"type":{"0":"pdf_book"},"word":{"0":"PDF Full Book"},"catalog_word":{"0":"PDF BOOK"},"bind_type":{"0":"electronic"}},"globally_free":1},"pdf_chapter":{"isbn":{},"isbn13":{},"price":{},"binding":{"0":"pdfc"},"for_sale":{},"free":{},"type":{"0":"pdf_chapter"},"word":{},"catalog_word":{},"bind_type":{}}},"product":null,"status":"final","authors":{"author":[{"sequence":"1","name":"Paula Whitacre","role":"Rapporteur"},{"sequence":"2","name":"Steven Kendall","role":"Rapporteur"},{"sequence":"3","name":"Anne-Marie Mazza","role":"Rapporteur"},{"sequence":"4","name":"Committee on Science, Technology, and Law","role":"Committee"},{"sequence":"5","name":"Policy and Global Affairs","role":"Division","acronym":"PGA","url":"http:\/\/www.nationalacademies.org\/pga"},{"sequence":"6","name":"National Academies of Sciences, Engineering, and Medicine","role":"Academy"}]},"display_authors":"National Academies of Sciences, Engineering, and Medicine; <a href=\"\/author\/PGA\">Policy and Global Affairs<\/a>; 
<a href=\"\/initiative\/committee-on-science-technology-and-law\">Committee on Science, Technology, and Law<\/a>; Paula Whitacre, Steven Kendall, and Anne-Marie Mazza, Rapporteurs","report_type_description":"Proceedings published by the National Academies of Sciences, Engineering, and Medicine chronicle the presentations and discussions at a workshop, symposium, or other event convened by the National Academies. The statements and opinions contained in proceedings are those of the participants and are not endorsed by other participants, the planning committee, or the National Academies.","all_products":[],"jhp":false,"orgs":{"CSTL":{"acronym":"CSTL","name":{"0":"Committee on Science, Technology, and Law"},"url":{"0":"https:\/\/www.nationalacademies.org\/cstl\/committee-on-science-technology-and-law"}}},"divisions":{"PGA":{"acronym":null,"name":{"0":"Policy and Global Affairs"},"url":{"0":"http:\/\/www.nationalacademies.org\/pga"}}},"resources":{"rel_resources":[{"form":"openbook","path":{"0":"http:\/\/www.nap.edu\/openbook.php?record_id=27997"},"comment":{"0":"Updated By Allegro"},"description":"","link_title":"Openbook","word":"OpenBook"}]},"links":{"download":[],"external":[],"commission":[]},"chapters":{"1":{"sequence":"1","title":{"0":"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop\u2014in Brief"},"price":{},"free":{"0":"on"},"page_range":"1\u201313","type":"pdf_chapter","binding":"pdfc","bind_type":"electronic","pages":["1","2","3","4","5","6","7","8","9","10","11","12","13"]}},"citation":{"string":"National Academies of Sciences, Engineering, and Medicine. 2024. <em>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop\u2014in Brief<\/em>. 
Washington, DC: The National Academies Press.","formats":{"bibtex":{"title":"Bibtex","url":"\/citation.php?type=bibtex&record_id=27997"},"endnote":{"title":"EndNote","url":"\/citation.php?type=enw&record_id=27997"},"ris":{"title":"Reference Manager","url":"\/citation.php?type=ris&record_id=27997"}}},"last_modified":"Fri, 11 Oct 2024 09:02:23 EDT"}, myChapter: {"sequence":"1","title":{"0":"Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop\u2014in Brief"},"price":{},"free":{"0":"on"},"page_range":"1\u201313","type":"pdf_chapter","binding":"pdfc","bind_type":"electronic","pages":["1","2","3","4","5","6","7","8","9","10","11","12","13"],"html":{"1":"\n <table style=\"width: 100%;\">\n <colgroup>\n <col style=\"width: 50%;\">\n <col style=\"width: 50%;\">\n <\/colgroup>\n <tbody>\n <tr>\n <td><img alt=\"NATIONAL ACADEMIES Sciences Engineering Medicine\" src=\"\/openbook\/27997\/xhtml\/images\/logo.jpg\" width=\"373\" height=\"69\" class=\"no-padding\"><\/td>\n <td><b>Proceedings of a Workshop&mdash;in Brief<\/b><\/td>\n <\/tr>\n <\/tbody>\n <\/table>\n <h1>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation<\/h1>\n <h1 class=\"h1a\">Proceedings of a Workshop&mdash;in Brief<\/h1>\n <hr>\n <p class=\"tx\">The online information environment enables the global exchange of information and ideas, but it also contributes to the proliferation of disinformation.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz1-7\" id=\"Axq\">1<\/a><\/sup> Online platforms operate at a scale where human-based content moderation to counter disinformation is impractical or at least very expensive and where purely technical solutions are challenging because content is often context-dependent. 
The speed, scale, and complexity of this ecosystem suggest that solutions are needed that consider the global nature of disinformation and effectively blend technical and human capabilities.<\/p>\n <p class=\"tx\">On April 10 and 11, 2024, an ad hoc committee under the auspices of the National Academies of Sciences, Engineering, and Medicine&rsquo;s Committee on Science, Technology, and Law (CSTL) convened a virtual workshop to consider practicable solutions to counter online disinformation, particularly in social media. The workshop agenda was organized around four interrelated areas: content moderation; educational interventions; technological interventions; and regulatory and other incentives and disincentives for behavior change. A broad &ldquo;call for solutions&rdquo; was issued to solicit novel approaches to detect, measure, and mitigate disinformation on social media, and 14 submissions were selected for discussion as part of the workshop.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz1-11\" id=\"Aqnl\">2<\/a><\/sup> Invited speakers made brief remarks about their proposed solutions, and via an interactive &ldquo;whiteboard,&rdquo; workshop participants shared comments and questions throughout the event.
Introductory audio-visual resources (e.g., short videos or slide decks) with additional detail about speakers&rsquo; work were available prior to the workshop.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz1-12\" id=\"A9sV\">3<\/a><\/sup><\/p>\n <p class=\"tx\"><b>Martha Minow<\/b> (Harvard University), co-chair of CSTL, welcomed participants, noting that the current workshop builds upon earlier CSTL work on Section 230 of the Communications Decency Act of 1996, which sought<\/p>\n <p class=\"tx1-1\">__________________<\/p>\n <section class=\"footnote\" epub:type=\"rearnotes\">\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz1-7\"><sup><a href=\"#Axq\">1<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">Defining, understanding, measuring, and addressing disinformation is challenging, with some rejecting the term &ldquo;disinformation&rdquo; and others suggesting that the field of disinformation studies must be fundamentally rethought. Indeed, while the topic of the workshop was disinformation, in their remarks, speakers referred to disinformation, misinformation, and malinformation.<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn-1\"><span role=\"doc-footnote\" epub:type=\"footnote\">For the purpose of these proceedings, disinformation is defined as information that is intentionally false and intended to deceive and mislead, hiding the interest and identity of those who developed and initially disseminated the disinformation. Misinformation is defined as false information presented as fact regardless of the intent to deceive, and malinformation is defined as information based on fact that is used out of context to mislead, manipulate, or harm.<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz1-11\"><sup><a href=\"#Aqnl\">2<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">More than 100 submissions were received. 
In addition to inviting presentations on selected solutions, the workshop planning committee invited seven individuals to speak about their work on disinformation as part of the four panel sessions.<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz1-12\"><sup><a href=\"#A9sV\">3<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">For the workshop&rsquo;s statement of task, biographical sketches of planning committee members and speakers, call for solutions, whiteboard, introductory audio-visual resources, and workshop video, see <a href=\"https:\/\/www.nationalacademies.org\/event\/41384_04-2024_evolving-technological-legal-and-social-solutions-to-counter-disinformation-in-social-media-a-workshop\">https:\/\/www.nationalacademies.org\/event\/41384_04-2024_evolving-technological-legal-and-social-solutions-to-counter-disinformation-in-social-media-a-workshop<\/a>.<\/span><\/p>\n <\/section>\n\n","2":"\n <p class=\"tx1\">to foster the growth of the internet by providing certain immunities for internet-based technology companies.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz2-5\" id=\"ARgI\">4<\/a><\/sup><\/p>\n <p class=\"tx\">Planning committee co-chair <b>Saul Perlmutter<\/b> (University of California, Berkeley and Lawrence Berkeley National Laboratory) shared the committee&rsquo;s overarching goals: to be collaborative in a generative workshop; get many ideas and approaches on the table&mdash;ideally those that might be new to some; and suggest new collaborations, areas of research, and proposals to counter disinformation.
Planning committee co-chair <b>Joan Donovan<\/b> (Boston University) said that the workshop is an important opportunity to identify effective measures for dealing with destructive online behavior.<\/p>\n <p class=\"tx\">To open the workshop, <b>Jonathan Corpus Ong<\/b> (University of Massachusetts Amherst) provided an ethnographic perspective to raise participant awareness of the range of issues associated with disinformation. He challenged the field of disinformation studies to be more globally minded and community driven. We shouldn&rsquo;t simply find ways to be more inclusive, he said, but must work to transform unjust systems, institutions, and ways of working. He suggested that we should challenge tech advocacies and media representations that perpetuate stereotypes of the Global South and ask how the disinformation studies space can empower civil society and researchers to lead in knowledge creation and design solutions.<\/p>\n <p class=\"tx\">Ong questioned whether it is possible to give researchers in the Global South access to the same toolkits (e.g., ad libraries, transparency audits, and local platform policy officers) and legal support measures available in the Global North. As an example of divergence of Global South and Global North activism, he referenced an instance where technology activists spoke out against proposed United Nations Educational, Scientific and Cultural Organization (UNESCO) guidelines for regulating digital platforms because they felt that the guidelines would enable a rubber stamping of local over-regulation. Separately, he described how, at a South-to-South knowledge exchange workshop in Rio de Janeiro, a leader of a Myanmar election coalition called out researchers from the North who inadvertently transformed the landscape of intervention by, for example, siphoning off local researchers to write case studies for Global North policy makers (at the expense of making strategic, hyperlocal interventions). 
As a blueprint for South-to-South knowledge exchange, Ong identified core principles developed by Global South researchers: 1) the region is a source of creativity, innovations, and solutions; 2) ways of working in the Global South are more culturally proximate and resonant than in the Global North, which means that tools from the Global North cannot just be replicated; 3) reflexivity in collaboration, citation, and knowledge creation is important; and 4) custom-built solutions begin with a critique of unjust global governance structures and extractive systems of knowledge creation.<\/p>\n <h2>CONTENT MODERATION<\/h2>\n <p class=\"tx1\">The first panel session, moderated by planning committee member <b>Amelia Acker<\/b> (University of Texas at Austin), explored approaches to counter online disinformation through content moderation, i.e., processes for reviewing and monitoring user-generated content for adherence to guidelines and standards.<\/p>\n <p class=\"tx\"><b>Brigham Adams<\/b> (Goodly Labs) described Public Editor, a system designed to alert readers of popular online content to reasoning errors, cognitive biases, and rhetorical manipulations by labeling words and phrases to call attention to manipulations and inaccuracies. Readers can see and learn from these labels, and the methods underpinning the ontology are open, transparent, and customizable. Adams said the overall goal of the project is to tilt public discourse toward better reasoning.<\/p>\n <p class=\"tx\"><b>Amelia Burke-Garcia<\/b> (NORC at the University of Chicago) and colleagues are developing a model that assesses online COVID-19 vaccination information. The model assesses how messages are framed&mdash;though it does not necessarily assess their veracity. Particular challenges arise from the fact that disinformation sites use health communication best practices to appear official (e.g., through the use of professional-appearing visuals and text). 
The model, which is undergoing testing and is expected to be deployed later this year, can be applied to contexts other than COVID-19 vaccinations. Importantly,<\/p>\n <p class=\"tx1-1\">__________________<\/p>\n <section class=\"footnote\" epub:type=\"rearnotes\">\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz2-5\"><sup><a href=\"#ARgI\">4<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">National Academies of Sciences, Engineering, and Medicine. 2021. <i>Section 230 Protections: Can Legal Revisions or Novel Technologies Limit Online Misinformation and Abuse?: Proceedings of a Workshop&mdash;in Brief<\/i>. Washington, DC: National Academies Press. <a href=\"https:\/\/doi.org\/10.17226\/26280\">https:\/\/doi.org\/10.17226\/26280<\/a>.<\/span><\/p>\n <\/section>\n\n","3":"\n <p class=\"tx1\">the project has examined where bias may be introduced in model development, and learnings from a bias assessment will be made available.<\/p>\n <p class=\"tx\"><b>Sarah Cen<\/b> (Massachusetts Institute of Technology) said that her work addresses two types of barriers that make tackling disinformation challenging: legal and regulatory. Legal barriers are related to Section 230 language and First Amendment free speech guarantees (discussed further below). Regulatory barriers relate to how regulations are implemented (e.g., as guidelines subject to compliance audits). She and colleagues have avoided a global definition of what is &ldquo;good,&rdquo; instead creating an approach around a more flexible standard. A requirement that platforms curate election-related content only from white-listed sources might exclude some legitimate sources, but a standard might say that curated content may come from a wide range of sources if it is similar to content available from white-listed sources. 
Cen&rsquo;s team develops content-auditing procedures that are practical and interpretable and have guarantees&mdash;which means that they can be characterized in terms of what they can and cannot be used for.<\/p>\n <p class=\"tx\"><b>Brenden Kuerbis<\/b> (Georgia Institute of Technology) spoke about research by the Internet Governance Project, which focuses on the scope, pattern, and trends of online, artificial intelligence (AI)-enabled disinformation during (nuclear) emergencies and on developing means to counteract the disinformation threat. He suggested that disinformation is a component within a broader propaganda framework. A preliminary project finding suggests that there is a need for a neutral, non-state-led networked governance structure with experts observing and countering threats, similar to those that have addressed spam, phishing, and other transnational cybersecurity issues. Such structures have focused on actual harms caused and the economic incentives of actors that engage in these activities.<\/p>\n <p class=\"tx\"><b>J. Nathan Matias<\/b> (Cornell University) said that adaptive algorithms or AI systems may adapt and respond to user behavior in ways where it is unclear whether they promote or hinder the spread of misinformation. Further, a long history of exclusion by race, gender, and culture has led to biases across computing and the social sciences. He suggested that these act as obstacles to creating infrastructure to address misinformation.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#AQs\" id=\"Att\">5<\/a><\/sup> The Citizens and Technology (CAT) Lab at Cornell University is co-creating software tools and research infrastructure for large-scale data analysis and experimentation. 
Matias underscored the need to protect researchers working in the media, academia, and civil society through efforts like the Coalition for Independent Technology Research,<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#ANG\" id=\"Av9\">6<\/a><\/sup> though he emphasized that it is important to support all individuals who do essential work&mdash;not just researchers.<\/p>\n <p class=\"tx\"><b>John Wihbey<\/b> (Northeastern University) proposed a research agenda stimulated by AI developments. Wihbey foresees, though does not necessarily endorse, greater use of chatbots that integrate large language models (LLMs) to counter disinformation. He suggested that, while LLMs offer advantages to social media companies in terms of speed and scale, there are ethical and design risks associated with their use. The classic triad for dealing with disinformation is to remove, reduce, and inform, but a new response might be a top-down authority that takes the form of a chatbot that interfaces with users and provides assistance, mediation, and warnings when disinformation is encountered. Wihbey called for &ldquo;getting ahead&rdquo; of the potential impacts of LLMs and &ldquo;bulking up&rdquo; ethical frameworks through considerations of how AI-related principles might be applied to situations where LLMs are used to counter disinformation.<\/p>\n <h2>REFLECTIONS FROM THE PANEL DISCUSSANT<\/h2>\n <p class=\"tx1\"><b>Nicole Cooke<\/b> (University of South Carolina) reflected on themes of access, transparency, community wisdom, and the need to include humans in interventions. She noted Adams&rsquo; emphasis on the need to account for cognitive, social, and emotional biases and observed that it is important both to agree on a taxonomy when talking about mis-, dis-, and malinformation<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#ANn\" id=\"A3SZ\">7<\/a><\/sup> and to involve different disciplines and the public. 
Cooke likened efforts to create solutions to counter disinformation to &ldquo;building the train tracks as the train is already speeding along.&rdquo; She questioned the feasibility of collaboration with governments when information is omitted, withheld, or weaponized. Cen&rsquo;s research led Cooke to consider approaches for getting more people to understand inter-<\/p>\n <p class=\"tx1-1\">__________________<\/p>\n <section class=\"footnote\" epub:type=\"rearnotes\">\n <p epub:type=\"rearnote\" class=\"fn\" id=\"AQs\"><sup><a href=\"#Att\">5<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">For a definition of misinformation, see <a href=\"\/read\/27997\/chapter\/1#pz1-7\">footnote 1<\/a>.<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"ANG\"><sup><a href=\"#Av9\">6<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">See <a href=\"https:\/\/independenttechresearch.org\/\">https:\/\/independenttechresearch.org\/<\/a>.<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"ANn\"><sup><a href=\"#A3SZ\">7<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">For a definition of malinformation, see <a href=\"\/read\/27997\/chapter\/1#pz1-7\">footnote 1<\/a>.<\/span><\/p>\n <\/section>\n\n","4":"\n <p class=\"tx1\">ventions. She noted that when technology leaders have testified before Congress, it is apparent that the technical aspects of disinformation are challenging to policy makers; to make the most of interventions, more education will be needed. She expressed appreciation for Wihbey&rsquo;s focus on ethics and keeping people at the forefront when considering the use of LLM chatbots.<\/p>\n <h2>DISCUSSION<\/h2>\n <p class=\"tx1\">Acker asked about infrastructure and partnerships needed to implement solutions offered by panelists. Adams said that Public Editor needs 2,000-3,000 annotators of content and is building its workforce, but that it has created tools for classrooms and citizen scientists. 
Burke-Garcia said her (and other health) projects addressing disinformation will grow through collaboration with health experts, technology experts, and community members. Matias said that it is a challenge to engage a diverse public and called for more work to engage communities, akin to the Cooperative Extension System of the U.S. Department of Agriculture.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz4-4\" id=\"Auu\">8<\/a><\/sup> Cen said that there is a need for an auditing infrastructure for platforms, given limits on the amount of information that researchers can ask platforms to provide. Wihbey highlighted the National Internet Observatory, which is being designed to provide researchers with access to large, representative panels of U.S. internet users for the purpose of researching online behaviors.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz4-5\" id=\"AYrA\">9<\/a><\/sup><\/p>\n <p class=\"tx\">An audience member noted the challenges of implementing solutions for countering disinformation when many politicians and members of the public are suspicious of scientists and their motives. Burke-Garcia said that community-trusted opinion leaders can become &ldquo;support mechanisms&rdquo; for sharing information and solutions with their communities, as happens with education about public health. Matias mentioned the debate over design as an influencer of outcomes and referred to a lawsuit filed by a group of attorneys general against Meta over products designed to keep young users online longer and bring them back to the platform repeatedly.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz4-6\" id=\"AGH\">10<\/a><\/sup> He suggested that a decision against Meta could open the way for a range of potential regulations and interventions.<\/p>\n <p class=\"tx\">Wihbey and Adams said that there is a role for both automated and human moderation of online content. 
Wihbey called for incentives to encourage platforms to collaborate more with civil society organizations, for example in the training of LLM chatbots to identify hate speech. Adams said that human moderation is challenging to scale and is relatively unstructured (in that different moderators may bring in their own biases and assess the same content differently). AI-enabled content moderation, while scalable, may miss content nuances discerned by humans.<\/p>\n <p class=\"tx\">Wihbey highlighted the problem of information taken out of context (i.e., malinformation), noting that Russian propaganda outlets have taken legitimate stories about the United States and amplified and distorted the negative elements to achieve disinformation goals. Cen pointed out the challenges in identifying the provenance of data. Humans typically verify information by checking its sources, but it is not always possible to identify the source of social media content. Rather than label an entire article &ldquo;fake news,&rdquo; Adams said that we might instead identify particular pieces of the content that are problematic. He noted that research that identifies social harms can induce social media platforms to change practices, as required under the European Union&rsquo;s (EU&rsquo;s) Digital Services Act (DSA).<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz4-13\" id=\"A5E\">11<\/a><\/sup><\/p>\n <h2>EDUCATIONAL INTERVENTIONS<\/h2>\n <p class=\"tx1\">The second panel session, moderated by planning committee member <b>Kate Starbird<\/b> (University of Washington), explored educational interventions that could help users become more resistant to disinformation and other forms of manipulation.<\/p>\n <p class=\"tx\"><b>Matthew Groh<\/b> (Northwestern University) said that countering disinformation on social media now means countering AI-generated disinformation. 
It is important for consumers to understand how AI-generated media is created, as well as its capabilities and limitations; with this understanding, they can better dif-<\/p>\n <p class=\"tx1-1\">__________________<\/p>\n <section class=\"footnote\" epub:type=\"rearnotes\">\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz4-4\"><sup><a href=\"#Auu\">8<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">See <a href=\"https:\/\/www.nifa.usda.gov\/about-nifa\/how-we-work\/extension\/cooperative-extension-system\">https:\/\/www.nifa.usda.gov\/about-nifa\/how-we-work\/extension\/cooperative-extension-system<\/a>.<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz4-5\"><sup><a href=\"#AYrA\">9<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">The observatory is being built at Northeastern University with funding from the National Science Foundation. See <a href=\"https:\/\/nationalinternetobservatory.org\">https:\/\/nationalinternetobservatory.org<\/a>.<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz4-6\"><sup><a href=\"#AGH\">10<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">For more information, see, e.g., <a href=\"https:\/\/www.cnbc.com\/2023\/10\/24\/bipartisan-group-of-ags-sue-meta-for-addictive-features.html\">https:\/\/www.cnbc.com\/2023\/10\/24\/bipartisan-group-of-ags-sue-meta-for-addictive-features.html<\/a>.<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz4-13\"><sup><a href=\"#A5E\">11<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">The Digital Services Act (see <a href=\"https:\/\/commission.europa.eu\/strategy-and-policy\/priorities-2019-2024\/europe-fit-digital-age\/digital-services-act_en\">https:\/\/commission.europa.eu\/strategy-and-policy\/priorities-2019-2024\/europe-fit-digital-age\/digital-services-act_en<\/a>) is discussed in greater depth below.<\/span><\/p>\n <\/section>\n\n","5":"\n <p class=\"tx1\">ferentiate between what has been generated by 
AI and what has not (and may therefore be better equipped to recognize deep-fake videos, images, or audio). Groh suggested that media literacy should encompass generative AI literacy so that individuals can better discern potential AI-generated disinformation. He called for ongoing research to evaluate when, where, why, and how ordinary people can spot AI-generated images, and on how to boost the ability to determine that material is AI-generated.<\/p>\n <p class=\"tx\"><b>Jonathan Osborne<\/b> (Stanford University) reported on a middle-school classroom curriculum developed with <b>Daniel Pimentel<\/b> (University of Alabama) to build resistance to scientific misinformation. The curriculum consists of nine once-monthly lessons that expose students to &ldquo;the grammar and language of science&rdquo; so that they can develop &ldquo;epistemic vigilance when confronted with misinformation&hellip;giving them the tools to make good choices about what to trust.&rdquo; The curriculum will be tested for efficacy with 50 science teachers.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz5-5\" id=\"AXz\">12<\/a><\/sup><\/p>\n <p class=\"tx\"><b>Rub&eacute;n Piacentini<\/b> (University of Rosario, Argentina) said that disinformation in his country, and perhaps others, exists not just in social media but also in schools and other contexts. Piacentini suggested that students have little exposure to scientific experimentation, which is detrimental when they enter universities and careers. To help increase scientific literacy and skills, he has helped develop a citizen science research program that measures air quality throughout the community. 
Participants complete a questionnaire about what they know about air quality, conduct experiments, and then re-test their knowledge based on their hands-on experience.<\/p>\n <p class=\"tx\"><b>Sander van der Linden<\/b> (University of Cambridge) described the challenge of identifying information that is false or that uses manipulative techniques. Citizens, he said, must be empowered to discern disinformation techniques in a manner that does not restrict their ability to form opinions. Van der Linden described a project to build &ldquo;immunization&rdquo; programs in collaboration with social media companies. YouTube has, for example, posted pre-bunks (&ldquo;weakened doses of misinformation or techniques used to produce misinformation to refute the techniques in advance&rdquo;), so that users can learn how to discern what is real and what is not. While lab results show promise, achieving large-scale &ldquo;herd immunity&rdquo; to the effects of disinformation will be a challenge.<\/p>\n <p class=\"tx\"><b>Matt Verich<\/b> (The Disinformation Project) approaches disinformation from a national security perspective. He observed that disinformation is targeted at the public, but that the public is missing from conversations about disinformation. The Disinformation Project brings awareness to teenagers with limited knowledge of how they are being targeted on social media. It offers extracurricular, multidisciplinary project-based activities for student-led chapters that provide the students with an understanding of tactics bad actors use to manipulate, deceive, and divide. Resources and research are made available to teens, empowering them to take action by changing online behaviors.<\/p>\n <h2>REFLECTIONS FROM THE PANEL DISCUSSANT<\/h2>\n <p class=\"tx1\"><b>Sam Wineburg<\/b> (Stanford University) noted that the panelists&rsquo; interventions illustrate the need for multiple educational interventions. 
He asked a series of questions: How does each intervention identify the biggest threats, and do the proposed instructional designs tackle them? Are the interventions scalable in schools, and what is the research base behind them? Are the proposed measures ecologically valid, and do they match what students do online? He emphasized the importance of research questions and educational interventions grounded in students&rsquo; real-world behaviors and actions.<\/p>\n <h2>DISCUSSION<\/h2>\n <p class=\"tx1\">For Verich, a priority is to increase teens&rsquo; awareness that the internet, while powerful, has dangers. He focuses on disinformation as a political weapon, noting that activities focused on this aspect provide residual benefits that allow teens to tackle other problems on the internet. Reflecting on a quote by physicist Richard Feynman: &ldquo;What I cannot create, I do not understand,&rdquo; Groh posited that if people understand how misinformation is created, they can discern when it is being used.<\/p>\n <p class=\"tx\">Starbird asked about how to balance learning about disinformation tactics with instruction that may teach people to become manipulative or cynical. Van der Linden<\/p>\n <p class=\"tx1-1\">__________________<\/p>\n <section class=\"footnote\" epub:type=\"rearnotes\">\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz5-5\"><sup><a href=\"#AXz\">12<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">Osborne pointed attendees to the publication &ldquo;Science Education in an Age of Misinformation&rdquo; for elaboration (J. Osborne et al. 2023. &ldquo;Science Education in an Age of Misinformation.&rdquo; <i>Science Education<\/i> 107: 553-571).<\/span><\/p>\n <\/section>\n\n
He added that, as is the case with vaccines, it is important to monitor side effects.<\/p>\n <p class=\"tx\">Starbird noted that the proposed educational interventions can take place both inside and outside the classroom. Verich said that The Disinformation Project chose an extracurricular approach but that, for a durable response, interventions should take place both inside and outside of the classroom to promote behaviors that will mitigate the most critical harms.<\/p>\n <p class=\"tx\">A participant asked how to support community-engaged solutions. Verich said that The Disinformation Project chapters are tailored to their communities. Communities also pass information to each other, van der Linden said. Starbird noted that some individuals and groups have claimed that media literacy brainwashes people. &ldquo;Even as we gather here,&rdquo; she said, &ldquo;there are politicized efforts to undermine&rdquo; efforts to counter disinformation and stop it in its tracks. &ldquo;When we approach educational interventions,&rdquo; she continued, &ldquo;how are we going to do that in a world where there are people who benefit from others being manipulated and do not necessarily want these interventions to be out there&rdquo; and who &ldquo;seem to be winning the day in some places?&rdquo;<\/p>\n <h2>TECHNOLOGICAL INTERVENTIONS<\/h2>\n <p class=\"tx1\">Planning committee member <b>Beth Mara Goldberg<\/b> (Jigsaw\/Google) moderated a panel session on technological interventions for countering disinformation that can complement content moderation and education, policy, and regulation.<\/p>\n <p class=\"tx\"><b>Wajeeha Ahmad<\/b> (Stanford University) discussed her research to quantify and counter the financial incentives for spreading disinformation. She said that disinformation websites, like other websites, make money through advertising. Advertisements from companies across industries appear on disinformation websites, often without the companies&rsquo; knowledge. 
Her work suggests that up to 13 percent of consumers would change their buying behaviors if they knew a company was advertising on disinformation sites. Ahmad emphasized that financial incentives to spread disinformation can be countered by improving transparency on where companies are advertising.<\/p>\n <p class=\"tx\"><b>Michelle Amazeen<\/b> (Boston University) discussed how mainstream media contribute to disinformation by disguising paid content to look like news, which confuses readers. Her research suggests that paid content, also known as sponsored content or native advertising, may influence readers&rsquo; perceptions when they believe it is regular news content. It may also influence how journalists cover topics related to the advertiser. Moreover, when sponsored content is shared on social media, disclosures mandated by the Federal Trade Commission (FTC) often disappear,<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz6-10\" id=\"A46I\">13<\/a><\/sup> and Google search results do not always indicate that content is sponsored. Native advertising may lead news outlets not to report unfavorable or contradictory information about the sponsoring company, which Amazeen characterized as a form of disinformation. She described her work with a team developing a protocol to create a native advertising observatory to identify and catalog thousands of native ad campaigns, beginning with those related to climate change, for use by researchers.<\/p>\n <p class=\"tx\"><b>Christopher Impey<\/b> (University of Arizona) described his research group&rsquo;s two-pronged approach to combat science misinformation. 
One prong is an instructional module that can fit into any science course to teach students how to detect &ldquo;fake science.&rdquo; The second prong, which relies on machine learning, involves the development of a browser plug-in and smartphone app that will activate when a user goes to a webpage with scientific content and alert users to potential misinformation. Impey compiled a list of the top 100 pseudo-science terms that appear on 1.3 billion unique webpages. Students are curating (i.e., classifying) articles based upon their legitimacy in order to develop training sets for neural networks.<\/p>\n <p class=\"tx\">Pinch-hitting for <b>Andrew Jenks<\/b> (Coalition for Content Provenance and Authenticity [C2PA]), planning committee member <b>Eric Horvitz<\/b> (Microsoft), who has made technical contributions to media provenance efforts, described C2PA&rsquo;s efforts to provide verification that content comes from a trusted source and has not<\/p>\n <p class=\"tx1-1\">__________________<\/p>\n <section class=\"footnote\" epub:type=\"rearnotes\">\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz6-10\"><sup><a href=\"#A46I\">13<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">For information about FTC disclosure requirements, see <a href=\"https:\/\/www.ftc.gov\/business-guidance\/resources\/disclosures-101-social-media-influencers\">https:\/\/www.ftc.gov\/business-guidance\/resources\/disclosures-101-social-media-influencers<\/a>.<\/span><\/p>\n <\/section>\n\n","7":"\n <p class=\"tx1\">been modified. C2PA is working with about 100 companies and 2000 organizations to establish digital media provenance standards. Horvitz noted that, several weeks ago, large technology companies announced they will embrace a C2PA standard called Content Credentials. 
The credential provides a graphic icon indicating whether metadata and other critical information is available as cryptographically signed data that verifies the source and provenance of the content.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz7-6\" id=\"ARs\">14<\/a><\/sup><\/p>\n <p class=\"tx\"><b>Ming Ming Chiu<\/b> (Education University of Hong Kong) presented work on behalf of his collaborators, <b>Jeong-Nam Kim<\/b> and <b>David Ebert<\/b> of the University of Oklahoma. Chiu said that many approaches to detect disinformation are opaque, untestable, or narrow. His team created <i>Deceptive Writing Theory<\/i>, a multilevel theory of disinformation causes and conditions (e.g., political or financial motives), and developed <i>Statistical Discourse Analysis and Multilevel Diffusion Analysis<\/i>, analytic tools that model how messages spread within and across sub-populations&mdash;by numbers of users, speed, and diffusion patterns. An AI-powered dashboard displays this information, along with a threat level, for each message. This aids government officials tracking dangerous messages, provides students with tools to detect them, and gives researchers from multiple disciplines the opportunity to build upon their work.<\/p>\n <p class=\"tx\"><b>Dongwon Lee<\/b> (Pennsylvania State University) is developing computational tools to flag disinformation at scale. He noted that existing academic tools have limited functionality. Social media companies have in-house solutions to flag disinformation, but do not share their algorithms. Lee suggested that LLMs have the potential to become a democratizing platform for the detection and explanation of disinformation in different settings and languages. 
His research has found that it is still relatively easy to fool LLMs into generating factually incorrect information at scale, but they can be configured to detect human and AI-generated disinformation with reasonable accuracy.<\/p>\n <h2>REFLECTIONS FROM THE PANEL DISCUSSANT<\/h2>\n <p class=\"tx1\"><b>Samuel Woolley<\/b> (University of Texas at Austin) welcomed the variety of technical tools available to aid producers and receivers of information and to assess content. He applauded Ahmad&rsquo;s ideas to counter disinformation through interventions that have financial implications for the producers of disinformation; Amazeen&rsquo;s suggestions on how to dismantle or disrupt the native advertising ecosystem; and Chiu&rsquo;s ideas for testing theories about how messages are spread (and for presenting the information on a dashboard). He suggested that Lee&rsquo;s effort to flag disinformation at scale using LLMs would be useful, particularly because of their capacity to perform in languages other than English. Impey&rsquo;s machine-learning effort has the capacity to distinguish disinformation across massive amounts of content. Woolley noted a need for resources and emphasized the importance of collaboration across disciplines, companies, and organizations, citing C2PA as an example. &ldquo;No single technological solution to this problem will actually deal with&rdquo; disinformation, so &ldquo;reiterating that to everyone we meet is so important,&rdquo; especially to policy makers and others who are looking for a &ldquo;magic bullet.&rdquo;<\/p>\n <h2>DISCUSSION<\/h2>\n <p class=\"tx1\">Horvitz said that traditional media also amplify content through, for example, word choices in headlines. Amazeen noted that search engine optimization has had an impact on headlines. In addition, news websites and apps often suggest content based on prior reading history, which can include sponsored content. 
Ahmad said that academic collaborations with companies that deal with advertisers (such as AdTech platforms, i.e., advertising technology software and tools advertisers use to buy, manage, and analyze digital advertising) would be helpful in identifying the extent to which companies&rsquo; behavior changes as a result of providing greater transparency to consumers.<\/p>\n <p class=\"tx\">Woolley said that unpacking connections between advertising, public relations, and political manipulation is important. He noted that funding for longitudinal studies on disinformation is dwindling. &ldquo;There is a large-scale politicization of this work that has the potential to continue to hinder our ability to do this work,&rdquo; he said. &ldquo;People set up a dichotomy between free speech and open content across platforms or privacy and safety. That dichotomy must be rejected.&rdquo;<\/p>\n <p class=\"tx1-1\">__________________<\/p>\n <section class=\"footnote\" epub:type=\"rearnotes\">\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz7-6\"><sup><a href=\"#ARs\">14<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">For more information, see <a href=\"https:\/\/c2pa.org\/post\/contentcredentials\/\">https:\/\/c2pa.org\/post\/contentcredentials\/<\/a>.<\/span><\/p>\n <\/section>\n\n","8":"\n <p class=\"tx1\">Planning committee member <b>Aziz Z. Huq<\/b> (University of Chicago) asked about data on consumer adoption when tools are available. &ldquo;If we think that individuals are not going to take advantage of opportunities to mitigate disinformation at a retail or individualized level, maybe that pushes toward more wholesale solutions,&rdquo; he said. Horvitz said more research is needed on whether consumers will use the Content Credential or other standards that he discussed. 
Ahmad said that companies normally outsource their advertising operations, and it is important to make it easier for companies to know where their advertising appears and to improve transparency by making that information persistently available to companies and consumers.<\/p>\n <p class=\"tx\">Kim emphasized that disinformation occurs worldwide and that technological solutions must be multilingual and multicultural, especially to reach poorer regions. Ebert added that multiple strategies are needed to counter long-term disinformation campaigns, which often use different techniques. He said that visual representations of information flows exert a powerful effect on researchers and the public. Goldberg highlighted larger, slow, long-term, subtle sets of disinformation narratives, which Starbird and colleagues have referred to as &ldquo;frames.&rdquo; Frames set, for example, by native advertising and generative AI, can subtly influence how the world is seen.<\/p>\n <p class=\"tx\">Donovan suggested the application of a &ldquo;public works mindset&rdquo; to move the field forward. &ldquo;If the metaphor of an information superhighway is somewhat accurate, where are the signs, where are the guardrails? If people are looking for credible information, particularly about public health, why don&rsquo;t they run into credible information quickly?&rdquo;<\/p>\n <p class=\"tx\">Chiu said that forcing social media users to wait a few seconds before forwarding a message can increase their reflexivity and change their forwarding behaviors. Woolley said that it is important to think of who should be held accountable for the impacts of disinformation and who should do the work of solving the problem. 
Horvitz noted that there are federal and state legislative efforts to require media provenancing, such as the California Provenance Authenticity and Watermarking Standards Act.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz8-5\" id=\"A21\">15<\/a><\/sup><\/p>\n <p class=\"tx\">Impey said that authority is questioned in all areas and that technological approaches to counter disinformation cannot be &ldquo;black-boxed.&rdquo; Kim raised the concept of norm-chilling, which posits that, when people perceive they hold a minority opinion, they may remain silent. He suggested, however, that, if norm-chilling causes people not to amplify a disinformation message, that could be a positive.<\/p>\n <p class=\"tx\">Chiu said that scale-up of interventions differs across political contexts. In Asia, for example, some ministries of information can impose requirements on companies that the U.S. government cannot (e.g., requiring the use of certain plug-ins on web browsers to flag disinformation). Horvitz called for greater public understanding of the value of media provenance and for funding for academic teams to assist in guiding the technology.<\/p>\n <p class=\"tx\">Planning committee member <b>Susan S. Silbey<\/b> (Massachusetts Institute of Technology) said that, &ldquo;We have been here before.&rdquo; She said that new technologies have always caused disruption to &ldquo;the fundamental patterns of life,&rdquo; even if anonymity contributes to the current problem. Producers of drugs, automobiles, and other products are held responsible for what they make or do through science, law, and other means. 
Human groups function by holding each other responsible for what they have done, she said, but no one is held responsible for disinformation.<\/p>\n <h2>REGULATORY AND OTHER INCENTIVES AND DISINCENTIVES FOR BEHAVIOR CHANGE<\/h2>\n <p class=\"tx1\">Huq, moderator of the workshop&rsquo;s final panel session, said that disinformation is shaped by a range of dynamics, suggesting that the educational solutions to disinformation available to consumers and commercial entities turn on regulatory choices made by government.<\/p>\n <p class=\"tx\"><b>Joshua Braun<\/b> (University of Massachusetts Amherst) discussed financial incentives for disinformation. He said that such incentives have curbed progress on moderation and transparency, and that the space has been difficult to reform. Digital advertising, with many players and transitions, is prone, he said, to what Charles Perrow referred to as normal accidents.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz8-12\" id=\"Ano\">16<\/a><\/sup> Rather than strictly<\/p>\n <p class=\"tx1-1\">__________________<\/p>\n <section class=\"footnote\" epub:type=\"rearnotes\">\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz8-5\"><sup><a href=\"#A21\">15<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">See <a href=\"https:\/\/trackbill.com\/bill\/california-assembly-bill-3211-california-provenance-authenticity-and-watermarking-standards\/2520580\/\">https:\/\/trackbill.com\/bill\/california-assembly-bill-3211-california-provenance-authenticity-and-watermarking-standards\/2520580\/<\/a>.<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz8-12\"><sup><a href=\"#Ano\">16<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">See C. Perrow. 2000. <i>Normal Accidents: Living with High-Risk Technologies<\/i>. Princeton, NJ: Princeton University Press. Perrow recognized two dimensions of risk&mdash;linear versus interactive complexity, and tight versus loose coupling. 
A &ldquo;normal accident&rdquo; is a situation in which the systems are so complex and interrelated, and opportunities for human intervention are so reduced, that an accident of some sort is almost inevitable. Braun sug-<\/span><\/p>\n <\/section>\n\n","9":"\n <p class=\"tx1\">technological solutions, &ldquo;normal accident&rdquo; literature suggests other solutions, such as problem solving that walks away from systems that are hard to manage, makes it easier for humans to intervene, decreases system complexity, focuses on change management and organizational cultures, and accepts failures as inevitable (but limits them). He said that these five approaches to problem solving might be applied to containing disinformation spread through AdTech and profit-driven disinformation.<\/p>\n <p class=\"tx\"><b>David Bray<\/b> (Henry Stimson Center) said that, while the increasing democratization of technology can be both good and bad, the tradecraft of information discernment has not been democratized. He looks at how to deter bad actors who use multiple means to spread disinformation against the interests of national defense, law enforcement, and civil norms (his biggest concern). Nonprofits, labs, and other groups must be involved in efforts to counter disinformation&mdash;not just governments or social media platforms. People need &ldquo;digital dignity&rdquo; wherein their human rights are respected online, but Bray said that their voice has been taken away by the taking of their data. Privacy protections, crowdsourcing, and a system in which professional certified data scientists accept responsibility (in a manner similar to certified public accountants) have a role to play in making technology that benefits customers, citizens, and communities.<\/p>\n <p class=\"tx\"><b>Nandini Jammi<\/b> (Check My Ads and Sleeping Giants) said that the disinformation crisis is fueled by advertising. 
As an example of digital ad watchdog activity, she described an intervention her organization engaged in when they were contacted about a scam that showed up on Google searches. They singled out the Google Vice President of Global Advertising and asked community members to email him directly about the scam. Google responded immediately by changing pertinent policies and banning the scammer. Jammi and her colleagues are &ldquo;moving up the ladder&rdquo; from individual advertisers to holding accountable bodies and companies that allow disinformation, scams, and crimes to proliferate.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz9-5\" id=\"AA0\">17<\/a><\/sup><\/p>\n <p class=\"tx\"><b>Jeff Kosseff<\/b> (U.S. Naval Academy) discussed what is and is not possible in the regulation of disinformation.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#And\" id=\"ADx0\">18<\/a><\/sup> He briefly described the origins of Section 230 of the Communications Decency Act of 1996 and acknowledged the many calls for legislative amendments to the section. Kosseff suggested that, even if Section 230 were to be repealed, the First Amendment protects a lot of false speech under the premise that, except for a few narrowly defined exceptions, speech should be safeguarded. He cautioned against the impulse for new legal and regulatory actions, calling instead for government efforts to fund media literacy, support libraries, and foster transparency and trust.<\/p>\n <p class=\"tx\"><b>Nathalie Smuha<\/b> (KU Leuven) said that, although laws do not solve everything (and are subject to abuse), they are a powerful tool in democratic societies. In Europe, different regulatory initiatives have been taken to address the concerns around disinformation, which U.S. institutions could explore for inspiration. 
She said that the most prominent initiative is the DSA, which has consequences for platforms outside the EU&mdash;for example in instances where their services are available to European consumers.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz9-9\" id=\"Acl\">19<\/a><\/sup> The DSA calls for increased transparency about algorithms, training, and terms and conditions, rather than identifying what types of speech must be taken down. The Act mandates that platforms and search engines that have a systemic impact undertake risk impact assessments. Social harms to users must be documented, acknowledged, and mitigated by social media companies. The Act also establishes a system of independent, impartial, trusted content flaggers with whom platforms must cooperate, puts in place processes for challenging a platform&rsquo;s decision to take down content, and compels the largest platforms to allow researcher access to key data. Procedural elements of the DSA make the Act a powerful mechanism for countering disinformation.<\/p>\n <p class=\"tx1-1\">__________________<\/p>\n <section class=\"footnote\" epub:type=\"rearnotes\">\n <p epub:type=\"rearnote\" class=\"fn1\"><span role=\"doc-footnote\" epub:type=\"footnote\">gested that Perrow&rsquo;s Normal Accident Theory could aid in the diagnosis of problems and offer a taxonomy to look for solutions.<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz9-5\"><sup><a href=\"#AA0\">17<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">She offered as examples of organizations involved in accountability the Trustworthy Accountability Group (TAG) (see <a href=\"https:\/\/www.tagtoday.net\">https:\/\/www.tagtoday.net<\/a>) and the Media Ratings Council (MRC) (see <a href=\"https:\/\/mediaratingcouncil.org\">https:\/\/mediaratingcouncil.org<\/a>). 
TAG &ldquo;advances its mission of eliminating fraudulent traffic, sharing of threat intelligence, promoting brand safety and enabling transparency by connecting industry leaders, analyzing threats, and sharing best practices worldwide&rdquo; and MRC &ldquo;is a not-for-profit industry self-regulatory body, established in 1963 at the request of U.S. Congress, that audits and accredits media measurement products and data sources across Digital, Out-of-Home, Print, Radio, Television, and cross-media products.&rdquo;<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"And\"><sup><a href=\"#ADx0\">18<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">Kosseff wrote a book on Section 230 (see J. Kosseff. 2019. <i>The Twenty-Six Words that Created the Internet<\/i>. Ithaca, NY: Cornell University Press.) and spoke at the National Academies&rsquo; 2021 workshop <i>Section 230 Protections: Can Legal Revisions or Novel Technologies Limit Online Misinformation and Abuse?<\/i> (see <a href=\"\/read\/27997\/chapter\/1#pz2-5\">footnote 4<\/a>).<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz9-9\"><sup><a href=\"#Acl\">19<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">For more information on the DSA, see <a href=\"https:\/\/commission.europa.eu\/strategy-and-policy\/priorities-2019-2024\/europe-fit-digital-age\/digital-services-act_en\">https:\/\/commission.europa.eu\/strategy-and-policy\/priorities-2019-2024\/europe-fit-digital-age\/digital-services-act_en<\/a>.<\/span><\/p>\n <\/section>\n\n","10":"\n <h2>REFLECTIONS FROM THE PANEL DISCUSSANT<\/h2>\n <p class=\"tx1\"><b>Brandie Nonnecke<\/b> (University of California, Berkeley) reflected on scale and complexity, asking whether the complexity of the internet is a necessary evil or whether digital ad markets and other features can be simplified. 
Taking a long view, she said that misinformation and disinformation have always occurred, because &ldquo;those in the past who had the means to record history also had the means to write it in their favor.&rdquo; As modes of inquiry change, so do actions, and it is important to treat causes, not just symptoms. Nonnecke said that a lack of trust in institutions allows actors to spread mis- and disinformation. She said that, while Section 230 and the DSA can give users greater power to choose content, modes of inquiry are needed to study what might happen, for example, if users choose content and consume only content that appeals to their world view.<\/p>\n <h2>DISCUSSION<\/h2>\n <p class=\"tx1\">Huq suggested that a person listening to the panelists would think that law and regulation do not have a significant role to play in addressing disinformation, but that there are many similar instances of &ldquo;wicked problems&rdquo; wherein it would be extraordinary to say the state has no role.<\/p>\n <p class=\"tx\">Bray said that, for free societies, it should not be solely up to the nation-state to solve the problem of disinformation, as this risks being perceived as censorship. He said that the line between political rhetoric, advertising, and disinformation is blurry, and that regulatory processes can be manipulated by technology. As an example, he drew upon his experience at the Federal Communications Commission when calls for public comments in 2014 and 2017 were inundated with huge numbers of bot-generated comments. Bray said that AI-generated content will make it even harder to distinguish between human and bot interactions. He said that he doesn&rsquo;t want too much power to remain with one sector, be it government or industry. Instead, he said that collectives, such as nonprofits, universities, or other groups of people, must be engaged. 
Bray suggested that an ideal law to mitigate disinformation would ensure that individuals have digital dignity and the right to know how their data is used.<\/p>\n <p class=\"tx\">Kosseff differentiated between legal versus regulatory responses. Free speech protections are critical to democracy, and he warned against narrowing free speech in any circumstance. Regardless of one&rsquo;s political views, it is not a good idea for judges to order takedowns of politicians&rsquo; postings, he said.<\/p>\n <p class=\"tx\">The digital advertising space is set up to maximize consumer engagement, and the focus is on the individual at the expense of context, Braun said. When the results of social media advertising are not successful, the industry has wanted to collect more data in a form of commercial surveillance.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz10-10\" id=\"AL0\">20<\/a><\/sup> Braun suggested that Congress could help rebalance the scales in favor of consumers by limiting the data that companies can collect or the uses to which it can be put. Huq mentioned the Supreme Court case <i>Sorrell v. IMS Health<\/i>, which ruled that collection and dissemination of data is speech covered by the First Amendment.<sup><a role=\"doc-noteref\" epub:type=\"noteref\" href=\"#pz10-11\" id=\"ASUl\">21<\/a><\/sup> &ldquo;Were <i>Sorrell<\/i> to be pushed to its logical extreme, many forms of data regulation that are implicitly in the proposals that Kosseff and Bray identified would be off the table,&rdquo; he said.<\/p>\n <p class=\"tx\">Jammi said her organization operates in the free market space by pitting market forces against each other. The initial focus has been on consumers versus advertisers. Constantly leveraging these forces against each other helps consumers have their voices heard at the highest level, she said. She agreed that consumers do not understand how their data are used and said that the next step for regulation is to force transparency. 
She said that a national registry of data brokers is needed to shed light on the identities of data brokers and the data they collect.<\/p>\n <p class=\"tx\">Smuha agreed that a combination of technical, organizational, and legal solutions is needed, but emphasized that, as members of a democratic society, we should come together in public deliberation to establish rules for governance. This, she said, is the spirit behind European legislation. Smuha said that, while the DSA is imperfect, it enables a wide range of stakeholders to play a role in safeguarding the online space. Adams called atten-<\/p>\n <p class=\"tx1-1\">__________________<\/p>\n <section class=\"footnote\" epub:type=\"rearnotes\">\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz10-10\"><sup><a href=\"#AL0\">20<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">Commercial surveillance is the business of collecting, analyzing, and profiting from information about people. See, e.g., <a href=\"https:\/\/www.ftc.gov\/news-events\/news\/press-releases\/2022\/08\/ftc-explores-rules-cracking-down-commercial-surveillance-lax-data-security-practices\">https:\/\/www.ftc.gov\/news-events\/news\/press-releases\/2022\/08\/ftc-explores-rules-cracking-down-commercial-surveillance-lax-data-security-practices<\/a>.<\/span><\/p>\n <p epub:type=\"rearnote\" class=\"fn\" id=\"pz10-11\"><sup><a href=\"#ASUl\">21<\/a><\/sup> <span role=\"doc-footnote\" epub:type=\"footnote\">This case is summarized at <a href=\"https:\/\/www.oyez.org\/cases\/2010\/10-779\">https:\/\/www.oyez.org\/cases\/2010\/10-779<\/a>.<\/span><\/p>\n <\/section>\n\n","11":"\n <p class=\"tx1\">tion to the DSA&rsquo;s provisions on data sharing and on the legal concept of social harm. 
If social harms can be demonstrated, provisions kick in that fund experimental research on interventions to mitigate the harms.<\/p>\n <p class=\"tx\">Kosseff said that no single solution, including media literacy or more government transparency, will fix the problem of misinformation. He said that commercial speech (inclusive of scams targeted to consumers) can be regulated by the FTC, but urged caution in enacting legislation or regulations that remove protections to free speech because &ldquo;threats to democracy are so much greater because, once you get rid of those protections, you&rsquo;re not going back.&rdquo;<\/p>\n <p class=\"tx\">Bray agreed with Kosseff&rsquo;s caution about interventions and warned against interventions that lead to autocracies of thought and reality. He expressed hope that, by learning how people fall for scams, we will collect data and metrics that can be useful when considering ways to enhance the ability of individuals and communities to better discern authentic vs. inauthentic information.<\/p>\n <p class=\"tx\">Huq called attention to questions submitted by the audience about the feasibility of a licensing system for generative AI and the right of free speech versus the right to amplify speech, which he referred to as &ldquo;speech at the retail level and speech at scale.&rdquo; Bray suggested that there are lessons to be learned from the experience with amateur radio licenses in the 1920s. At the time, radio was seen as having both promise and perils. Licensing did not prohibit speaking at scale, but licensees had to go through a process and identify who they were to obtain a license. Bray warned about restrictions on amplifying speech, which he suggested could lead to abuses such as governments not allowing private citizens to broadcast their views. Instead, he suggested licensing, certification, or a requirement that data brokers and others take an ethical oath akin to the Hippocratic Oath. 
He said that Australia is experimenting with citizen juries where representative members of the public can raise concerns or objections about a range of topics that can include online content. Kosseff suggested that regulating amplification would have the same impact as regulating speech because platforms exist to amplify or de-amplify speech.<\/p>\n <h2>LOOKING FORWARD: REFLECTIONS FROM THE PLANNING COMMITTEE<\/h2>\n <p class=\"tx1\">At the conclusion of the workshop, six members of the workshop planning committee shared what they had taken away from workshop discussions.<\/p>\n <p class=\"tx\">Donovan reflected on the need to understand the DSA and called for a transition to durable institutions and infrastructure that can create a healthy environment where people have the right to truth. She challenged social media companies, search engines, and other information conduits to put more work into ontologies and ranking systems. The field&rsquo;s increasing coordination is heartening, she said, but she shared frustration that the code of TALK (Timely Accurate Local Knowledge) has not been cracked as this is the kind of information that people need to make informed decisions. While there is promise in technology and citizen participation, some profit from disinformation. Donovan said that we need to protect biometric privacy as AI technology advances and said that we all hold the keys to unlocking a future different than the present.<\/p>\n <p class=\"tx\">Horvitz said that traditional media must become involved in countering disinformation and collaborate with other stakeholders to build media provenance technologies. Interventions that show promise should be pursued, even if they seem obvious or like small steps. Research is needed to develop best practices, pool results, and think through the players and disciplines that should be involved (e.g., social psychology, cognitive psychology, technology). 
It is important to understand the implications of technologies to prevent them from backfiring and causing harm, including in the Global South. For Horvitz, a big question is, &ldquo;How do we avoid entering a post-epistemic world that our grandchildren will be living in?&rdquo; He expressed concern about the potential of adversarial generative AI systems that can understand the world, take goals from autocratic rulers, and generate believable stories that combine disinformation and real-world events.<\/p>\n <p class=\"tx\">Huq noted that many words and concepts&mdash;such as free speech, market, and state&mdash;can refer to a range of possibilities. He reflected on the implications of moving away from a dichotomy between markets and states toward a choice between different social equilibria: that<\/p>\n\n","12":"\n <p class=\"tx1\">is, between a healthy public sphere or one in which bad speech drives out good and where it is hard to rely on the machinery of government to produce laws to protect speech and democratic values. While it is impossible to imagine either an entirely civic or entirely regulatory approach moving towards good, he is now more skeptical of relying on individuals at the retail level to equip themselves with the right tools to counter disinformation. Disinformation is a problem at scale that democratic governments must address.<\/p>\n <p class=\"tx\">Silbey urged more attention to history. Creators, innovators, and disruptors often have minimal notions of how the social ecosystem works, she said. People are not just producers or consumers, they are participants. Data transparency, access, and privacy are important to consider when thinking about how to protect individuals, but it is equally important to create the capacity for accountability for failures. The market can do this, she said, but law, professional norms, and standardization are also critical. 
There are actions at multiple levels of scale&mdash;individuals, masses, organizations, systems&mdash;but capacities at each of these scales are unequal. This has resulted in a concentration of ownership and a proliferation of bad actors. Silbey said that it is important to understand how people buy into disinformation, but it is also important to remember that many &ldquo;unknown unknowns&rdquo; exist.<\/p>\n <p class=\"tx\">Starbird noted that sponsors of disinformation research have shifted priorities in recent years. No one direction is right or wrong, she said, but online disinformation requires a multi-pronged approach&mdash;and education is an important dimension.<\/p>\n <p class=\"tx\">Perlmutter said that he was encouraged to hear about ideas that can work to counter disinformation despite the many challenges identified during the workshop. He expressed interest in research, experimentation, and monitoring. We have encountered challenging periods before, he said, but multiple approaches can lead to progress. He suggested that it is important to develop techniques for citizen oversight so that we are not at the mercy of trusting that industry or government will effectively address disinformation.<\/p>\n\n","13":"\n <p class=\"tx1\"><b>DISCLAIMER<\/b> This Proceedings of a Workshop&mdash;in Brief has been prepared by <b>Paula Whitacre, Steven Kendall<\/b>, and <b>Anne-Marie Mazza<\/b> as a factual summary of what occurred at the meeting. The committee&rsquo;s role was limited to planning the event. 
The statements made are those of the individual workshop participants and do not necessarily represent the views of all participants, the project sponsors, the planning committee, the Committee on Science, Technology, and Law, or the National Academies.<\/p>\n <p class=\"tx1-1\"><b>REVIEWERS<\/b> To ensure that it meets institutional standards for quality and objectivity, this Proceedings of a Workshop&mdash;in Brief was reviewed by <b>Jon Bateman<\/b>, Carnegie Endowment for International Peace; <b>Joshua Braun<\/b>, University of Massachusetts Amherst; and <b>Steven Brill<\/b>, NewsGuard. <b>Marilyn Baker<\/b>, National Academies of Sciences, Engineering, and Medicine, served as the review coordinator.<\/p>\n <p class=\"tx1-1\"><b>PLANNING COMMITTEE Joan Donovan<\/b> (<i>Co-chair<\/i>), Boston University; <b>Saul Perlmutter<\/b> (<i>Co-chair<\/i>), University of California Berkeley and Lawrence Berkeley National Laboratory; <b>Amelia Acker<\/b>, University of Texas at Austin; <b>Hany Farid<\/b>, University of California, Berkeley; <b>Beth Mara Goldberg<\/b>, Jigsaw (Google); <b>Mark Hansen<\/b>, Columbia University; <b>Eric Horvitz<\/b>, Microsoft; <b>Aziz Z. Huq<\/b>, University of Chicago; <b>Susan S. Silbey<\/b>, Massachusetts Institute of Technology; <b>Kate Starbird<\/b>, University of Washington.<\/p>\n <p class=\"tx1-1\"><b>NATIONAL ACADEMIES OF SCIENCES, ENGINEERING, AND MEDICINE STAFF Steven Kendall<\/b>, Senior Program Officer; <b>Anne-Marie Mazza<\/b>, Senior Director; <b>Renee Daly<\/b>, Senior Program Assistant.<\/p>\n <p class=\"tx1-1\"><b>SPONSORS<\/b> This project was funded by the Gordon and Betty Moore Foundation and the Rockefeller Foundation.<\/p>\n <p class=\"tx-2\"><b>SUGGESTED CITATION<\/b> National Academies of Sciences, Engineering, and Medicine. 2024. <i>Evolving Technological, Legal, and Social Solutions to Counter Online Disinformation: Proceedings of a Workshop&mdash;in Brief<\/i>. Washington, DC: National Academies Press. 
<a href=\"https:\/\/doi.org\/10.17226\/27997\">https:\/\/doi.org\/10.17226\/27997<\/a>.<\/p>\n <table style=\"width: 100%;\">\n <colgroup>\n <col style=\"width: 70%;\">\n <col style=\"width: 30%;\">\n <\/colgroup>\n <tbody>\n <tr class=\"col4\">\n <td class=\"tdt\">\n <p class=\"tx1-2\"><b>Policy and Global Affairs<\/b><\/p>\n <p class=\"tx1-2\"><i>Copyright 2024 by the National Academy of Sciences. All rights reserved.<\/i><\/p>\n <\/td>\n <td class=\"tdt\"><img alt=\"NATIONAL ACADEMIES Sciences Engineering Medicine The National Academies provide independent, trustworthy advice that advances solutions to society&rsquo;s most complex challenges.\" src=\"\/openbook\/27997\/xhtml\/images\/img-13-1.jpg\" width=\"303\" height=\"206\"><\/td>\n <\/tr>\n <\/tbody>\n <\/table>\n\n"}}, user: null, formats: {"image":"gif","html":"xhtml","dimensions":{"width":1200,"height":1553}}, directory: "27997", currentPage: '1', firstPage: '1', lastPage: '13', pagesRead: [] }; </script> <script src="/read/js/clipboard.js"></script> <script src="/read/js/vendor.min.js"></script> <script src="/read/js/jquery.highlight.js"></script> <script src="/read/js/openbook.js"></script> </body> </html>
