Journal of Medical Internet Research - Clinically Applicable Segmentation of Head and Neck Anatomy for Radiotherapy: Deep Learning Algorithm Development and Validation Study

J Med Internet Res 2021;23(7):e26151. doi:10.2196/26151. Published July 12, 2021. https://www.jmir.org/2021/7/e26151

Authors: Stanislav Nikolov, Sam Blackwell, Alexei Zverovitch, Ruheena Mendes, Michelle Livne, Jeffrey De Fauw, Yojan Patel, Clemens Meyer, Harry Askham, Bernadino Romera-Paredes, Christopher Kelly, Alan Karthikesalingam, Carlton Chu, Dawn Carnell, Cheng Boon, Derek D'Souza, Syed Ali Moinuddin, Bethany Garie, Yasmin McQuinlan, Sarah Ireland, Kiarna Hampton, Krystle Fuller, Hugh Montgomery, Geraint Rees, Mustafa Suleyman, Trevor Back, Cían Owen Hughes, Joseph R Ledsam, Olaf Ronneberger

Abstract

Background: Over half a million individuals are diagnosed with head and neck cancer each year globally. Radiotherapy is an important curative treatment for this disease, but manually delineating the radiosensitive organs at risk is time consuming. This planning process can delay treatment and introduces interoperator variability, resulting in downstream differences in radiation dose. Although auto-segmentation algorithms offer a potentially time-saving solution, the challenges in defining, quantifying, and achieving expert performance remain.

Objective: Adopting a deep learning approach, we aim to demonstrate a 3D U-Net architecture that achieves expert-level performance in delineating 21 distinct head and neck organs at risk commonly segmented in clinical practice.

Methods: The model was trained on a data set of 663 deidentified computed tomography scans acquired in routine clinical practice, with both segmentations taken from clinical practice and segmentations created by experienced radiographers as part of this research, all in accordance with consensus organ-at-risk definitions.

Results: We demonstrated the model's clinical applicability by assessing its performance on a test set of 21 computed tomography scans from clinical practice, each with 21 organs at risk segmented by 2 independent experts. We also introduced the surface Dice similarity coefficient, a new metric for the comparison of organ delineation that quantifies the deviation between organ-at-risk surface contours rather than volumes, better reflecting the clinical task of correcting errors in automated organ segmentations. The model's generalizability was then demonstrated on 2 distinct open-source data sets, representing centers and countries different from those used in model training.

Conclusions: Deep learning is an effective and clinically applicable technique for the segmentation of head and neck anatomy for radiotherapy. With appropriate validation studies and regulatory approvals, this system could improve the efficiency, consistency, and safety of radiotherapy pathways.

Keywords: artificial intelligence; machine learning; radiotherapy; convolutional neural networks; segmentation; contouring; surface DSC; U-Net
.icon{font-size:2.4rem}.full-width-card-wrapper .full-width-card-info{flex-grow:1;padding:15px 15px 15px 0}@media screen and (max-width:61.9375em){.full-width-card-wrapper .full-width-card-info{flex-basis:50%}}@media screen and (max-width:47.9375em){.full-width-card-wrapper .full-width-card-info{flex-basis:100%;padding:15px}}.full-width-card-wrapper .full-width-card-info-title{margin-top:0}.full-width-card-wrapper .full-width-card-highlight:after,.full-width-card-wrapper .full-width-card-highlight:before{color:#5d6581;content:"..."}.full-width-card-wrapper .full-width-card-info-download-links a{margin-right:9px}@media screen and (max-width:61.9375em){.full-width-card-wrapper .full-width-card-info-button button{margin:10px 0 0;width:100%}}.full-width-card-wrapper .full-width-card-info-group-buttons{display:flex;margin-top:14px}@media screen and (max-width:61.9375em){.full-width-card-wrapper .full-width-card-info-group-buttons a:first-child{display:block;margin-bottom:14px;margin-left:0!important;margin-right:0!important}.full-width-card-wrapper .full-width-card-info-group-buttons a:last-child{display:block;margin-left:0!important;margin-right:0!important}.full-width-card-wrapper .full-width-card-info-group-buttons{display:block}}.full-width-card-wrapper .full-width-card-info-group-buttons button{margin-right:10px}@media screen and (max-width:61.9375em){.full-width-card-wrapper .full-width-card-info-group-buttons button{margin:10px 0 0;width:100%}}.full-width-card-wrapper .full-width-card-info-date-published{margin-bottom:20px}.full-width-card-wrapper .full-width-card-altmetric{align-self:center;margin-left:auto;padding-right:15px}@media screen and (max-width:61.9375em){.full-width-card-wrapper .full-width-card-altmetric{margin:10px auto 15px;padding:0}}.full-width-card-wrapper .full-width-card-altmetric img{width:auto}.img-cont{padding:15px}@media screen and 
(max-width:47.9375em){.img-cont{width:100%}}.cards{justify-content:space-between;margin-bottom:14px}.cards,.cards .card{display:flex;flex-wrap:wrap}.cards .card{flex-basis:31%;flex-direction:column;flex-grow:0;flex-shrink:0;transition:.3s}@media screen and (max-width:61.9375em){.cards .card{flex-basis:48%}}@media screen and (max-width:47.9375em){.cards .card{flex-basis:100%}}.cards .card:hover .card-header .card-img img{filter:brightness(1);transform:scale(1.1);transition:all .3s ease}.cards .card-header,.cards .card-img{border-radius:10px;border-radius:3px;overflow:hidden}.cards .card-img{height:200px;position:relative}.cards .card-img img{filter:brightness(.6);height:auto;transition:all .3s ease;width:100%}@media screen and (max-width:61.9375em){.cards .card-img{height:180px}}@media screen and (max-width:47.9375em){.cards .card-img{height:200px}}.cards .card-img-info{background-color:#1e70c2;border-radius:3px 0 0 0;cursor:pointer;outline:none;padding:11px;position:absolute;right:0;top:160px}@media screen and (max-width:61.9375em){.cards .card-img-info{top:140px}}@media screen and (max-width:47.9375em){.cards .card-img-info{top:160px}}.cards .card-img-info .icon{color:#fff;font-size:1.8rem;transition:all .3s ease}.cards .card-img-info:hover .icon{font-size:2rem}.cards .card-img-info:focus .icon{font-size:2.4rem}.cards .card-body{flex-grow:1;flex-shrink:0}.cards .card-body,.cards .card-title{display:flex;flex-direction:column}.cards .card-title a{color:#1a254c;outline:none;transition:all .3s ease}.cards .card-title a:focus,.cards .card-title a:hover{color:#1e70c2;-webkit-text-decoration:underline;text-decoration:underline;transition:all .3s ease}.cards .card-info p{max-height:125px;overflow:scroll}.cards .card-info p a{color:#1e70c2;cursor:pointer}.cards .card-info p a:hover{-webkit-text-decoration:underline;text-decoration:underline}.cards .card-years{display:flex;flex-wrap:wrap}.cards .card-years a{color:#1e70c2;cursor:pointer;margin:0 10px 5px 0}.cards 
.card-years a:focus,.cards .card-years a:hover{outline:none;-webkit-text-decoration:underline;text-decoration:underline}.cards .card-date-social{color:#5d6581;display:flex;flex-wrap:wrap}.v--modal-overlay{background:rgba(17,26,55,.7);box-sizing:border-box;height:100vh;left:0;opacity:1;position:fixed;top:0;width:100%;z-index:99999}.v--modal-overlay .v--modal-background-click{min-height:100%;padding-bottom:10px;width:100%}.v--modal-overlay .v--modal-background-click .v--modal-top-right{display:block;position:absolute;right:0;top:0}.v--modal-overlay .v--modal-background-click .v--modal-box{box-sizing:border-box;position:relative}.v--modal-overlay .v--modal-background-click .v--modal{border-radius:3px;box-shadow:0 20px 60px -2px rgba(27,33,58,.4);text-align:left}.modal-window{background-color:#fff;height:100%;overflow:hidden;position:relative}.modal-window-header{background:#f3f3f5;border-bottom:1px solid #bab4b4;display:flex;flex-wrap:wrap;justify-content:space-between;padding:10px 20px}.modal-window-title{margin:0}.modal-window-close{background-color:#f3f3f5;border:none;color:gray;cursor:pointer;font-size:2rem;transition:.3s}.modal-window-close:hover{color:#000}.modal-window-body{height:-webkit-fill-available;max-height:400px;overflow-y:auto;padding:20px}.modal-window-footer{background-color:#f8f9fa;border-top:1px solid #bab4b4}.modal-window-footer div{float:right;padding:10px 20px} .cookieControl__Modal-enter-active,.cookieControl__Modal-leave-active{transition:opacity 
.25s}.cookieControl__Modal-enter,.cookieControl__Modal-leave-to{opacity:0}.cookieControl__Bar--center{left:50%;top:50%;transform:translate(-50%,-50%)}.cookieControl__Bar--bottom-full-enter-active,.cookieControl__Bar--bottom-full-leave-active,.cookieControl__Bar--bottom-left-enter-active,.cookieControl__Bar--bottom-left-leave-active,.cookieControl__Bar--bottom-right-enter-active,.cookieControl__Bar--bottom-right-leave-active,.cookieControl__Bar--center-enter-active,.cookieControl__Bar--center-leave-active,.cookieControl__Bar--top-full-enter-active,.cookieControl__Bar--top-full-leave-active,.cookieControl__Bar--top-left-enter-active,.cookieControl__Bar--top-left-leave-active,.cookieControl__Bar--top-right-enter-active,.cookieControl__Bar--top-right-leave-active{transition:transform .25s}.cookieControl__Bar--top-full-enter,.cookieControl__Bar--top-full-leave-to,.cookieControl__Bar--top-left-enter,.cookieControl__Bar--top-left-leave-to,.cookieControl__Bar--top-right-enter,.cookieControl__Bar--top-right-leave-to{transform:translateY(-100%)}.cookieControl__Bar--bottom-full-enter,.cookieControl__Bar--bottom-full-leave-to,.cookieControl__Bar--bottom-left-enter,.cookieControl__Bar--bottom-left-leave-to,.cookieControl__Bar--bottom-right-enter,.cookieControl__Bar--bottom-right-leave-to{transform:translateY(100%)}.cookieControl__Bar--center-enter,.cookieControl__Bar--center-leave-to{transform:translate(-50%,-50%) scale(.95)}.cookieControl{position:relative;z-index:100000}.cookieControl button{backface-visibility:hidden;border:0;cursor:pointer;font-size:16px;outline:0;padding:12px 20px;transition:background-color .2s,color .2s}.cookieControl__Bar{background-color:var(--cookie-control-barBackground);font-family:Arial,"Helvetica Neue",Helvetica,sans-serif;position:fixed}.cookieControl__Bar h3,.cookieControl__Bar p{color:var(--cookie-control-barTextColor);max-width:900px}.cookieControl__Bar h3{font-size:20px;margin:0}.cookieControl__Bar p{font-size:16px;margin:5px 0 
0}.cookieControl__Bar button{background-color:var(--cookie-control-barButtonBackground);color:var(--cookie-control-barButtonColor)}.cookieControl__Bar button:hover{background-color:var(--cookie-control-barButtonHoverBackground);color:var(--cookie-control-barButtonHoverColor)}.cookieControl__Bar button+button{margin-left:10px}.cookieControl__BarContainer{align-items:flex-end;display:flex;justify-content:space-between;padding:20px}.cookieControl__Bar--bottom-full,.cookieControl__Bar--top-full{left:0;right:0}.cookieControl__Bar--top-full{top:0}.cookieControl__Bar--bottom-full{bottom:0}.cookieControl__Bar--bottom-left p,.cookieControl__Bar--bottom-right p,.cookieControl__Bar--center p,.cookieControl__Bar--top-left p,.cookieControl__Bar--top-right p{max-width:400px}.cookieControl__Bar--bottom-left .cookieControl__BarContainer,.cookieControl__Bar--bottom-right .cookieControl__BarContainer,.cookieControl__Bar--center .cookieControl__BarContainer,.cookieControl__Bar--top-left .cookieControl__BarContainer,.cookieControl__Bar--top-right .cookieControl__BarContainer{flex-direction:column}.cookieControl__Bar--bottom-left .cookieControl__BarButtons,.cookieControl__Bar--bottom-right .cookieControl__BarButtons,.cookieControl__Bar--center .cookieControl__BarButtons,.cookieControl__Bar--top-left .cookieControl__BarButtons,.cookieControl__Bar--top-right 
.cookieControl__BarButtons{margin-top:20px}.cookieControl__Bar--top-left,.cookieControl__Bar--top-right{top:20px}.cookieControl__Bar--bottom-left,.cookieControl__Bar--bottom-right{bottom:20px}.cookieControl__Bar--bottom-left,.cookieControl__Bar--top-left{left:20px}.cookieControl__Bar--bottom-right,.cookieControl__Bar--top-right{right:20px}.cookieControl__BarButtons{display:flex}.cookieControl__Modal{bottom:0;font-size:0;left:0;position:fixed;right:0;text-align:center;top:0;z-index:1}.cookieControl__Modal:before{content:"";display:inline-block;min-height:100vh;vertical-align:middle}.cookieControl__Modal:after{background-color:var(--cookie-control-modalOverlay);bottom:0;content:"";left:0;opacity:var(--cookie-control-modalOverlayOpacity);position:absolute;right:0;top:0;z-index:-1}.cookieControl__Modal>div{font-size:medium;padding-top:80px}.cookieControl__Modal button{background-color:var(--cookie-control-modalButtonBackground);color:var(--cookie-control-modalButtonColor)}.cookieControl__Modal button:hover{background-color:var(--cookie-control-modalButtonHoverBackground);color:var(--cookie-control-modalButtonHoverColor)}.cookieControl__ModalContent{background-color:var(--cookie-control-modalBackground);display:inline-block;max-height:80vh;max-width:550px;overflow-y:scroll;padding:40px;position:relative;text-align:left;vertical-align:middle;width:100%}.cookieControl__ModalContent,.cookieControl__ModalContent :not(button){color:var(--cookie-control-modalTextColor)}.cookieControl__ModalContent h3{font-size:24px;margin:50px 0 25px}.cookieControl__ModalContent h3:first-of-type{margin-top:0}.cookieControl__ModalContent ul{font-size:16px;list-style-type:none;padding:0}.cookieControl__ModalContent ul ul{padding:5px 56px 0}.cookieControl__ModalContent ul ul li+li{margin-top:5px}.cookieControl__ModalContent li{align-items:center}.cookieControl__ModalContent li+li{margin-top:20px}.cookieControl__ModalContent input{display:none}.cookieControl__ModalContent 
input:checked+label{background-color:var(--cookie-control-checkboxActiveBackground)}.cookieControl__ModalContent input:checked+label:before{background-color:var(--cookie-control-checkboxActiveCircleBackground);transform:translate3d(100%,-50%,0)}.cookieControl__ModalContent input:checked:disabled+label{background-color:var(--cookie-control-checkboxDisabledBackground)}.cookieControl__ModalContent input:checked:disabled+label:before{background-color:var(--cookie-control-checkboxDisabledCircleBackground)}.cookieControl__ModalContent label{backface-visibility:hidden;background-color:var(--cookie-control-checkboxInactiveBackground);border-radius:20px;display:block;font-size:0;margin-right:20px;min-height:20px;min-width:36px;position:relative;transition:background-color .2s}.cookieControl__ModalContent label:before{background-color:var(--cookie-control-checkboxInactiveCircleBackground);border-radius:50%;content:"";height:15px;left:3px;position:absolute;top:50%;transform:translate3d(0,-50%,0);transition:transform .2s;width:15px}.cookieControl__ModalInputWrapper{align-items:flex-start;display:flex}.cookieControl__ModalCookieName{font-weight:700;text-transform:uppercase}.cookieControl__ModalCookieName span{font-weight:400;text-transform:none}.cookieControl__ModalClose{position:absolute;right:20px;top:20px}.cookieControl__ModalButtons{align-items:flex-start;display:flex;margin-top:80px}.cookieControl__ModalButtons button+button{margin-left:20px}.cookieControl__ModalUnsaved{bottom:40px;color:var(--cookie-control-modalUnsavedColor);font-size:14px;left:50%;margin:0;position:absolute;transform:translateX(-50%)}.cookieControl__BlockedIframe{border:2px solid #ddd;padding:20px}.cookieControl__BlockedIframe a,.cookieControl__BlockedIframe p{font-family:Arial,"Helvetica Neue",Helvetica,sans-serif}@media screen and (max-width:768px){.cookieControl__Bar{flex-direction:column;left:0;right:0}.cookieControl__Bar h3,.cookieControl__Bar 
p{max-width:100%}.cookieControl__Bar--top-full,.cookieControl__Bar--top-left,.cookieControl__Bar--top-right{top:0}.cookieControl__Bar--bottom-full,.cookieControl__Bar--bottom-left,.cookieControl__Bar--bottom-right{bottom:0}.cookieControl__ModalContent{bottom:0;left:0;max-height:100%;max-width:none;padding:80px 20px 20px;position:absolute;right:0;top:0}.cookieControl__BarButtons{flex-direction:column;justify-content:center;margin-top:20px;width:100%}.cookieControl__BarButtons button{width:100%}.cookieControl__BarButtons button+button{margin:10px 0 0}.cookieControl__BarContainer,.cookieControl__ModalButtons{flex-direction:column}.cookieControl__ModalButtons button{width:100%}.cookieControl__ModalButtons button+button{margin:10px 0 0}}.cookieControl__ControlButton{backface-visibility:hidden;background:var(--cookie-control-controlButtonBackground);border:0;border-radius:50%;bottom:20px;box-shadow:0 0 10px rgba(0,0,0,.3);cursor:pointer;height:40px;min-height:40px;min-width:40px;outline:0;position:fixed;right:20px;transition:background-color .2s;width:40px}.cookieControl__ControlButton svg{backface-visibility:hidden;color:var(--cookie-control-controlButtonIconColor);left:50%;max-height:24px;max-width:24px;min-height:24px;min-width:24px;position:absolute;top:50%;transform:translate(-50%,-50%);transition:color .2s}.cookieControl__ControlButton:hover{background-color:var(--cookie-control-controlButtonHoverBackground)}.cookieControl__ControlButton:hover svg{color:var(--cookie-control-controlButtonIconHoverColor)} .vue-modal-resizer{bottom:0;cursor:se-resize;height:12px;overflow:hidden;right:0;width:12px;z-index:9999999}.vue-modal-resizer,.vue-modal-resizer:after{background:transparent;display:block;position:absolute}.vue-modal-resizer:after{border-bottom:10px solid #ddd;border-left:10px solid transparent;content:"";height:0;left:0;top:0;width:0}.vue-modal-resizer.clicked:after{border-bottom:10px solid 
#369be9}.v--modal-block-scroll{overflow:hidden;width:100vw}.v--modal-overlay{background:rgba(0,0,0,.2);box-sizing:border-box;height:100vh;left:0;opacity:1;position:fixed;top:0;width:100%;z-index:999}.v--modal-overlay.scrollable{-webkit-overflow-scrolling:touch;height:100%;min-height:100vh;overflow-y:auto}.v--modal-overlay .v--modal-background-click{height:auto;min-height:100%;width:100%}.v--modal-overlay .v--modal-box{box-sizing:border-box;overflow:hidden;position:relative}.v--modal-overlay.scrollable .v--modal-box{margin-bottom:2px}.v--modal{background-color:#fff;border-radius:3px;box-shadow:0 20px 60px -2px rgba(27,33,58,.4);padding:0;text-align:left}.v--modal.v--modal-fullscreen{height:100vh;left:0;margin:0;top:0;width:100vw}.v--modal-top-right{display:block;position:absolute;right:0;top:0}.overlay-fade-enter-active,.overlay-fade-leave-active{transition:all .2s}.overlay-fade-enter,.overlay-fade-leave-active{opacity:0}.nice-modal-fade-enter-active,.nice-modal-fade-leave-active{transition:all .4s}.nice-modal-fade-enter,.nice-modal-fade-leave-active{opacity:0;-webkit-transform:translateY(-20px);transform:translateY(-20px)}.vue-dialog div{box-sizing:border-box}.vue-dialog .dialog-flex{height:100%;width:100%}.vue-dialog .dialog-content{flex:1 0 auto;font-size:14px;padding:15px;width:100%}.vue-dialog .dialog-c-title{font-weight:600;padding-bottom:15px}.vue-dialog .vue-dialog-buttons{border-top:1px solid #eee;display:flex;flex:0 1 auto;width:100%}.vue-dialog .vue-dialog-buttons-none{padding-bottom:15px;width:100%}.vue-dialog-button{background:transparent;border:0;box-sizing:border-box;color:inherit;cursor:pointer;font-size:12px!important;height:40px;line-height:40px;font:inherit;margin:0;outline:none;padding:0}.vue-dialog-button:hover{background:rgba(0,0,0,.01)}.vue-dialog-button:active{background:rgba(0,0,0,.025)}.vue-dialog-button:not(:first-of-type){border-left:1px solid #eee} 
.nuxt-progress{background-color:#3088df;height:2px;left:0;opacity:1;position:fixed;right:0;top:0;transition:width .1s,opacity .4s;width:0;z-index:999999}.nuxt-progress.nuxt-progress-notransition{transition:none}.nuxt-progress-failed{background-color:red} .mt-0{margin-top:0!important}.mt-5{margin-top:5px!important}.mt-10{margin-top:10px!important}.mt-15{margin-top:15px!important}.mt-20{margin-top:20px!important}.mt-25{margin-top:25px!important}.mt-30{margin-top:30px!important}.mt-35{margin-top:35px!important}.mt-40{margin-top:40px!important}.mt-45{margin-top:45px!important}.mt-50{margin-top:50px!important}.mt-55{margin-top:55px!important}.mt-60{margin-top:60px!important}.mt-65{margin-top:65px!important}.mt-70{margin-top:70px!important}.mt-75{margin-top:75px!important}.mt-80{margin-top:80px!important}.mt-85{margin-top:85px!important}.mt-90{margin-top:90px!important}.mt-95{margin-top:95px!important}.mt-100{margin-top:100px!important}.mb-0{margin-bottom:0!important}.mb-5{margin-bottom:5px!important}.mb-10{margin-bottom:10px!important}.mb-15{margin-bottom:15px!important}.mb-20{margin-bottom:20px!important}.mb-25{margin-bottom:25px!important}.mb-30{margin-bottom:30px!important}.mb-35{margin-bottom:35px!important}.mb-40{margin-bottom:40px!important}.mb-45{margin-bottom:45px!important}.mb-50{margin-bottom:50px!important}.mb-55{margin-bottom:55px!important}.mb-60{margin-bottom:60px!important}.mb-65{margin-bottom:65px!important}.mb-70{margin-bottom:70px!important}.mb-75{margin-bottom:75px!important}.mb-80{margin-bottom:80px!important}.mb-85{margin-bottom:85px!important}.mb-90{margin-bottom:90px!important}.mb-95{margin-bottom:95px!important}.mb-100{margin-bottom:100px!important}.ml-0{margin-left:0!important}.ml-5{margin-left:5px!important}.ml-10{margin-left:10px!important}.ml-15{margin-left:15px!important}.ml-20{margin-left:20px!important}.ml-25{margin-left:25px!important}.ml-30{margin-left:30px!important}.ml-35{margin-left:35px!important}.ml-40{margin-left:40px!important}.ml-
45{margin-left:45px!important}.ml-50{margin-left:50px!important}.ml-55{margin-left:55px!important}.ml-60{margin-left:60px!important}.ml-65{margin-left:65px!important}.ml-70{margin-left:70px!important}.ml-75{margin-left:75px!important}.ml-80{margin-left:80px!important}.ml-85{margin-left:85px!important}.ml-90{margin-left:90px!important}.ml-95{margin-left:95px!important}.ml-100{margin-left:100px!important}.mr-0{margin-right:0!important}.mr-5{margin-right:5px!important}.mr-10{margin-right:10px!important}.mr-15{margin-right:15px!important}.mr-20{margin-right:20px!important}.mr-25{margin-right:25px!important}.mr-30{margin-right:30px!important}.mr-35{margin-right:35px!important}.mr-40{margin-right:40px!important}.mr-45{margin-right:45px!important}.mr-50{margin-right:50px!important}.mr-55{margin-right:55px!important}.mr-60{margin-right:60px!important}.mr-65{margin-right:65px!important}.mr-70{margin-right:70px!important}.mr-75{margin-right:75px!important}.mr-80{margin-right:80px!important}.mr-85{margin-right:85px!important}.mr-90{margin-right:90px!important}.mr-95{margin-right:95px!important}.mr-100{margin-right:100px!important}.pt-0{padding-top:0!important}.pt-5{padding-top:5px!important}.pt-10{padding-top:10px!important}.pt-15{padding-top:15px!important}.pt-20{padding-top:20px!important}.pt-25{padding-top:25px!important}.pt-30{padding-top:30px!important}.pt-35{padding-top:35px!important}.pt-40{padding-top:40px!important}.pt-45{padding-top:45px!important}.pt-50{padding-top:50px!important}.pt-55{padding-top:55px!important}.pt-60{padding-top:60px!important}.pt-65{padding-top:65px!important}.pt-70{padding-top:70px!important}.pt-75{padding-top:75px!important}.pt-80{padding-top:80px!important}.pt-85{padding-top:85px!important}.pt-90{padding-top:90px!important}.pt-95{padding-top:95px!important}.pt-100{padding-top:100px!important}.pb-0{padding-bottom:0!important}.pb-5{padding-bottom:5px!important}.pb-10{padding-bottom:10px!important}.pb-15{padding-bottom:15px!important}.pb-20{padd
ing-bottom:20px!important}.pb-25{padding-bottom:25px!important}.pb-30{padding-bottom:30px!important}.pb-35{padding-bottom:35px!important}.pb-40{padding-bottom:40px!important}.pb-45{padding-bottom:45px!important}.pb-50{padding-bottom:50px!important}.pb-55{padding-bottom:55px!important}.pb-60{padding-bottom:60px!important}.pb-65{padding-bottom:65px!important}.pb-70{padding-bottom:70px!important}.pb-75{padding-bottom:75px!important}.pb-80{padding-bottom:80px!important}.pb-85{padding-bottom:85px!important}.pb-90{padding-bottom:90px!important}.pb-95{padding-bottom:95px!important}.pb-100{padding-bottom:100px!important}.pl-0{padding-left:0!important}.pl-5{padding-left:5px!important}.pl-10{padding-left:10px!important}.pl-15{padding-left:15px!important}.pl-20{padding-left:20px!important}.pl-25{padding-left:25px!important}.pl-30{padding-left:30px!important}.pl-35{padding-left:35px!important}.pl-40{padding-left:40px!important}.pl-45{padding-left:45px!important}.pl-50{padding-left:50px!important}.pl-55{padding-left:55px!important}.pl-60{padding-left:60px!important}.pl-65{padding-left:65px!important}.pl-70{padding-left:70px!important}.pl-75{padding-left:75px!important}.pl-80{padding-left:80px!important}.pl-85{padding-left:85px!important}.pl-90{padding-left:90px!important}.pl-95{padding-left:95px!important}.pl-100{padding-left:100px!important}.pr-0{padding-right:0!important}.pr-5{padding-right:5px!important}.pr-10{padding-right:10px!important}.pr-15{padding-right:15px!important}.pr-20{padding-right:20px!important}.pr-25{padding-right:25px!important}.pr-30{padding-right:30px!important}.pr-35{padding-right:35px!important}.pr-40{padding-right:40px!important}.pr-45{padding-right:45px!important}.pr-50{padding-right:50px!important}.pr-55{padding-right:55px!important}.pr-60{padding-right:60px!important}.pr-65{padding-right:65px!important}.pr-70{padding-right:70px!important}.pr-75{padding-right:75px!important}.pr-80{padding-right:80px!important}.pr-85{padding-right:85px!important}.pr-90{
padding-right:90px!important}.pr-95{padding-right:95px!important}.pr-100{padding-right:100px!important}*,:after,:before{box-sizing:inherit}html{font-size:62.5%}body{background-color:#fff;box-sizing:border-box;color:#1a254c;font-family:"Roboto",sans-serif;font-size:1.4rem;font-weight:400;margin:0;padding:0}#jmir-html{position:relative}#skip-link a{font-size:1.6rem;font-weight:700;height:1px;left:-10000px;margin:10px 0 10px 10px;overflow:hidden;padding:10px;position:absolute;-webkit-text-decoration:underline;text-decoration:underline;top:auto;width:1px}#skip-link a:focus{border:2px solid #f69038;display:inline-block;height:auto;position:static;width:auto}#main-layout-container{display:flex;flex-direction:column;height:100vh;justify-content:space-between}.cookieControl{z-index:999999999}.cookieControl__Bar{border-radius:0;box-shadow:0 0 40px -10px rgba(0,0,0,.75)}.cookieControl__BarButtons button,.cookieControl__ModalButtons button{cursor:pointer;font-size:14px;opacity:1;padding:10px 20px;text-align:center;transition:.3s}.cookieControl__BarButtons button:focus,.cookieControl__BarButtons button:hover,.cookieControl__ModalButtons button:focus,.cookieControl__ModalButtons button:hover{font-weight:400;opacity:.9;-webkit-text-decoration:none;text-decoration:none}.cookieControl__BarButtons button:focus,.cookieControl__ModalButtons button:focus{outline:2px solid #f69038!important;outline-offset:6px!important}.cookieControl__BarButtons button:first-child{background-color:#f1f3f5;border:1px solid #dcdee0;color:#1a254c;display:none}.cookieControl__BarButtons button:first-child:active{background-color:#dcdee0}.cookieControl__BarButtons button:last-child{background-color:#367c3a;border:1px solid #367c3a;color:#fff}.cookieControl__BarButtons button:last-child:active{background-color:#3b9d3f;border:1px solid #3b9d3f}.cookieControl__ModalButtons button:first-child{background-color:#367c3a;border:1px solid #367c3a;color:#fff}.cookieControl__ModalButtons 
button:first-child:active{background-color:#3b9d3f;border:1px solid #3b9d3f}.cookieControl__ModalButtons button:last-child{background-color:#b30000;border:1px solid #b30000;color:#fff}.cookieControl__ModalButtons button:last-child:active{background-color:#ba302d;border:1px solid #ba302d}.cookieControl__Modal>div{padding-top:50px}.cookieControl__ModalContent{background-color:#f3f3f5;padding-left:0;padding-right:0;position:relative}.cookieControl__ModalContent div:first-child{background-color:#fff;border-top:1px solid #bab4b4;padding-left:20px;padding-right:20px}.cookieControl__ModalClose{top:5px!important}.cookieControl__ModalContent ul li div{border-top:none!important}.cookieControl__ModalContent ul li ul li{font-size:14px;font-style:italic;margin-left:21px}.cookieControl__ModalContent ul:last-child{padding-bottom:20px}.cookieControl__ModalContent h2{font-size:18px;margin:0;position:absolute;top:5px!important}.cookieControl__ModalContent h3{padding-top:20px}.cookieControl__ModalButtons{bottom:10px;margin-left:auto;position:absolute;right:20px;width:-moz-fit-content;width:fit-content}.cookieControl__ModalClose{background-color:transparent!important;color:gray!important}.cookieControl__ModalClose:hover{color:#000!important}.cookieControl__ModalCookieName{text-transform:none!important}.toasted-container.top-right{right:2%!important;top:2%!important}._hj-3ZiaL__MinimizedWidgetBottom__container{bottom:.5%!important;flex-direction:row!important;justify-content:flex-end;width:84%!important}span a.a-select-membership{font-size:1.2rem;padding:5px 10px;-webkit-text-decoration:none!important;text-decoration:none!important} 
.corporate__user-info p[data-v-575455fb]{color:#1a254c;line-height:10px}.top-nav .corporate__user-info small[data-v-575455fb]{color:#1a254c;opacity:.7}.top-nav .corporate__user-img[data-v-575455fb]{border-radius:50px;overflow:hidden}.top-nav .corporate__user-img img[data-v-575455fb]{height:auto;width:35px}.top-nav .corporate__user-details .icon[data-v-575455fb]{color:#1a254c!important;font-size:1.4rem}.top-nav .corporate__link-submenu--user[data-v-575455fb]{min-width:180px;padding:0;top:50px}.top-nav .corporate__link-submenu--user p[data-v-575455fb]{font-weight:700;padding:0 20px}.bottom-nav-1[data-v-575455fb]{background-color:#1a254c}@media screen and (max-width:59.625em){.bottom-nav-1[data-v-575455fb]{display:none}}@media screen and (max-width:47.9375em){.bottom-nav-1 .container[data-v-575455fb]{margin-left:0;margin-right:0;max-width:none;padding:0}}@media screen and (max-width:61.9375em){.bottom-nav-1 .container[data-v-575455fb]{max-width:none}}@media screen and (max-width:59.6875em){.bottom-nav-2[data-v-575455fb]{background-color:#1a254c;padding:0}}@media screen and (min-width:59.6875em){.bottom-nav-2[data-v-575455fb]{display:none}}.bottom-nav-2 .journal ul[data-v-575455fb]{margin:0;padding:0;width:100%}.bottom-nav-2 .journal__link-item--journals[data-v-575455fb]{width:100%}@media screen and (max-width:37.5em){.bottom-nav-2 .journal__link-submenu-journals li a[data-v-575455fb]{padding:10px}}@media screen and (max-width:26.9375em){.bottom-nav-2 .journal__link-submenu-journals li a[data-v-575455fb]{padding:10px 10px 15px}}@media screen and (max-width:37.5em){.bottom-nav-2 .journal__link--home[data-v-575455fb]{padding:10px}}.journal[data-v-575455fb]{align-items:center;display:flex;flex-wrap:wrap;justify-content:flex-start}.journal__nav[data-v-575455fb]{background-color:#1a254c;display:flex}.journal__links[data-v-575455fb]{display:flex;flex-wrap:wrap;margin:0;padding:0}.journal__link-item[data-v-575455fb]{cursor:pointer;list-style:none;outline:none;padding:16px 
0;position:relative;transition:all .2s ease}@media screen and (max-width:45.9375em){.journal__link-item[data-v-575455fb]{width:100%}}.journal__link-item:focus-within .journal__link[data-v-575455fb],.journal__link-item:hover .journal__link[data-v-575455fb]{color:#b3b3b3;transition:all .2s ease}.journal__link-item:focus-within .icon.fa-caret-down[data-v-575455fb],.journal__link-item:hover .icon.fa-caret-down[data-v-575455fb]{color:#b3b3b3;transform:rotate(180deg);transition:all .2s ease}.journal__link-item:focus-within .journal__link-submenu[data-v-575455fb],.journal__link-item:hover .journal__link-submenu[data-v-575455fb]{visibility:visible}.journal__link-item .icon[data-v-575455fb]{color:#fff;font-size:14px}.journal__link-item--journals[data-v-575455fb]{padding:0}.journal__journals-list[data-v-575455fb]{outline:none}.journal__journals-list:focus .icon[data-v-575455fb]{border:1px solid #fff;transform:rotate(1turn);transition:.5s}.journal__journals-list a[data-v-575455fb]{padding-right:10px}.journal__journals-list .icon[data-v-575455fb]{border-left:1px solid #fff;font-size:1.6rem}.journal__journals-list .icon.fas.fa-arrow-down[data-v-575455fb]{padding:16px}@media screen and (max-width:37.5em){.journal__journals-list .icon.fas.fa-arrow-down[data-v-575455fb]{padding:12px}}.journal__journals-list .icon.fas.fa-times[data-v-575455fb]{padding:16px 17.5px}@media screen and (max-width:37.5em){.journal__journals-list .icon.fas.fa-times[data-v-575455fb]{padding:12px 13.5px}}.journal__link-item-container[data-v-575455fb]{align-items:center;display:flex;justify-content:space-between}.journal__link[data-v-575455fb]{color:#fff;cursor:pointer;font-size:1.6rem;-webkit-text-decoration:none;text-decoration:none;transition:all .2s ease}@media screen and 
(max-width:45.9375em){.journal__link[data-v-575455fb]{padding-left:30px}}.journal__link--home[data-v-575455fb]{color:#fff;cursor:pointer;font-size:1.6rem;font-weight:400!important;padding:16px;-webkit-text-decoration:none;text-decoration:none;transition:.3s;width:100%}.journal__link--home[data-v-575455fb]:focus{text-indent:4px}.journal__link-submenu[data-v-575455fb]{background:#fff;box-shadow:0 0 10px 0 rgba(0,0,0,.2);display:block;padding:0;position:absolute;top:42px;visibility:hidden;white-space:nowrap;z-index:300}.journal__link-submenu[data-v-575455fb]:before{border:7px solid transparent;border-bottom-color:#fff;content:"";height:0;left:50%;position:absolute;top:-14px;transform:translateX(-50%);width:14px}@media screen and (max-width:45.9375em){.journal__link-submenu[data-v-575455fb]:before{display:none}.journal__link-submenu[data-v-575455fb]{position:static}}.journal__link-submenu li[data-v-575455fb]{list-style:none}.journal__link-submenu li a[data-v-575455fb]{color:#1a254c;display:block;padding:10px 20px;-webkit-text-decoration:none!important;text-decoration:none!important}.journal__link-submenu li a[data-v-575455fb]:focus,.journal__link-submenu li a[data-v-575455fb]:hover{background-color:#eceff9;font-weight:400}@media screen and (max-width:45.9375em){.journal__link-submenu li a[data-v-575455fb]{padding:10px 40px}}.journal__link-submenu-journals[data-v-575455fb]{background:#1e70c2;box-shadow:0 4px 10px 0 rgba(0,0,0,.2);display:none;height:400px;left:0;overflow-y:scroll;padding:0;position:absolute;top:51px;white-space:nowrap;width:437px;z-index:300}@media screen and (max-width:45.9375em){.journal__link-submenu-journals[data-v-575455fb]{position:static}}.journal__link-submenu-journals li[data-v-575455fb]{border-top:1px solid #fff;list-style:none;transition:all .3s ease}.journal__link-submenu-journals li[data-v-575455fb]:focus,.journal__link-submenu-journals 
li[data-v-575455fb]:hover{background-color:#2d77c6;font-weight:400;text-indent:8px}.journal__link-submenu-journals li a[data-v-575455fb]{color:#fff;display:flex;flex-wrap:wrap;font-size:1.6rem;justify-content:space-between;padding:10px 10px 10px 15px;position:relative;-webkit-text-decoration:none!important;text-decoration:none!important}@media screen and (max-width:22.1875em){.journal__link-submenu-journals li a[data-v-575455fb]{font-size:1.4rem}}.journal__link-submenu-journals li a span.articles-number[data-v-575455fb]{align-self:center;font-size:1rem}.journal__link-item--journal-info[data-v-575455fb]{margin-left:29px}.journal__link-item--browse[data-v-575455fb],.journal__link-item--journal-info[data-v-575455fb]{margin-right:29px}.journal__link-item--journals[data-v-575455fb]{background-color:#1e70c2;width:437px}.journal__link-submenu--journal-info[data-v-575455fb]{left:-50px}.journal__link-submenu--browse[data-v-575455fb]{left:-30px}.journal__link-submenu--select[data-v-575455fb]{align-items:center;display:flex;padding:10px 20px}@media screen and (max-width:45.9375em){.journal__link-submenu--select[data-v-575455fb]{padding:10px 40px!important}}.journal__link-submenu--select label[data-v-575455fb]{margin-right:10px}.journal__link-submenu--select select[data-v-575455fb]{width:-webkit-fill-available}.journal__link-submenu--select select[data-v-575455fb]:focus{outline:none}.journal__submit-article[data-v-575455fb]{font-size:1.6rem!important;-webkit-text-decoration:none!important;text-decoration:none!important}.journal__submit-article[data-v-575455fb]:focus{font-weight:400;outline:2px solid #f69038!important;outline-offset:6px!important}@media screen and (max-width:45.9375em){.journal__submit-article[data-v-575455fb]{margin-left:30px;padding:12px}}.mobile-nav[data-v-575455fb]{background-color:#1a254c}@media screen and (max-width:59.625em){.mobile-nav[data-v-575455fb]{display:block}}.mobile-nav 
a[data-v-575455fb]{-webkit-text-decoration:none;text-decoration:none}.mobile-nav__links[data-v-575455fb]{display:flex;flex-direction:column;flex-wrap:wrap}.mobile-nav__links a[data-v-575455fb]{border-bottom:1px solid #485170;color:#fff;padding:12px 15px}.mobile-nav__links[data-v-575455fb]:first-child{border-top:1px solid #485170}.mobile-nav__expandable[data-v-575455fb]{align-items:center;justify-content:space-between}.mobile-nav__expandable[data-v-575455fb],.mobile-nav__user-account[data-v-575455fb]{display:flex;flex-wrap:wrap}.mobile-nav__user-img[data-v-575455fb]{border:2px solid #fff;border-radius:50px;margin-right:10px;overflow:hidden}.mobile-nav__user-img img[data-v-575455fb]{height:auto;width:35px}.mobile-nav__user-info small[data-v-575455fb]{opacity:.7}.mobile-nav__user-submenu[data-v-575455fb]{background-color:#313b5e;display:flex;flex-basis:100%;flex-direction:column}.mobile-nav__user-submenu p[data-v-575455fb]{color:#fff;font-weight:700;opacity:.5;padding:0 15px}.mobile-nav__user-submenu a[data-v-575455fb]{border-bottom:none}.mobile-nav__social-media[data-v-575455fb]{padding:12px 15px}.mobile-nav__social-media p[data-v-575455fb]{color:#fff;font-weight:700}input[data-v-575455fb]{border-radius:0!important}.show-journals[data-v-575455fb]{display:block!important}.journal__link-submenu-journals[data-v-575455fb]::-webkit-scrollbar{width:7px}.journal__link-submenu-journals[data-v-575455fb]::-webkit-scrollbar-track{background-color:#fff;border:1px solid rgba(0,0,0,.15)}.journal__link-submenu-journals[data-v-575455fb]::-webkit-scrollbar-thumb{background:#2d77c6;border-radius:20px}.journal__link-submenu-journals[data-v-575455fb]::-webkit-scrollbar-thumb:hover{background:#1e60bc}.corporate__search-results[data-v-575455fb]::-webkit-scrollbar{width:5.5px}.corporate__search-results[data-v-575455fb]::-webkit-scrollbar-track{background-color:#fff;border:1px solid 
rgba(0,0,0,.15);border-radius:3px}.corporate__search-results[data-v-575455fb]::-webkit-scrollbar-thumb{background:#1e70c2;border-radius:20px}.overlay[data-v-575455fb]{background:rgba(17,26,55,.702);height:100vh;left:0;padding:200vh 200vw;position:fixed;top:0;width:100%;z-index:999}.remove-styling[data-v-575455fb]{border:0!important;padding:0!important} .mt-0{margin-top:0!important}.mt-5{margin-top:5px!important}.mt-10{margin-top:10px!important}.mt-15{margin-top:15px!important}.mt-20{margin-top:20px!important}.mt-25{margin-top:25px!important}.mt-30{margin-top:30px!important}.mt-35{margin-top:35px!important}.mt-40{margin-top:40px!important}.mt-45{margin-top:45px!important}.mt-50{margin-top:50px!important}.mt-55{margin-top:55px!important}.mt-60{margin-top:60px!important}.mt-65{margin-top:65px!important}.mt-70{margin-top:70px!important}.mt-75{margin-top:75px!important}.mt-80{margin-top:80px!important}.mt-85{margin-top:85px!important}.mt-90{margin-top:90px!important}.mt-95{margin-top:95px!important}.mt-100{margin-top:100px!important}.mb-0{margin-bottom:0!important}.mb-5{margin-bottom:5px!important}.mb-10{margin-bottom:10px!important}.mb-15{margin-bottom:15px!important}.mb-20{margin-bottom:20px!important}.mb-25{margin-bottom:25px!important}.mb-30{margin-bottom:30px!important}.mb-35{margin-bottom:35px!important}.mb-40{margin-bottom:40px!important}.mb-45{margin-bottom:45px!important}.mb-50{margin-bottom:50px!important}.mb-55{margin-bottom:55px!important}.mb-60{margin-bottom:60px!important}.mb-65{margin-bottom:65px!important}.mb-70{margin-bottom:70px!important}.mb-75{margin-bottom:75px!important}.mb-80{margin-bottom:80px!important}.mb-85{margin-bottom:85px!important}.mb-90{margin-bottom:90px!important}.mb-95{margin-bottom:95px!important}.mb-100{margin-bottom:100px!important}.ml-0{margin-left:0!important}.ml-5{margin-left:5px!important}.ml-10{margin-left:10px!important}.ml-15{margin-left:15px!important}.ml-20{margin-left:20px!important}.ml-25{margin-left:25px!important}.ml-30
{margin-left:30px!important}.ml-35{margin-left:35px!important}.ml-40{margin-left:40px!important}.ml-45{margin-left:45px!important}.ml-50{margin-left:50px!important}.ml-55{margin-left:55px!important}.ml-60{margin-left:60px!important}.ml-65{margin-left:65px!important}.ml-70{margin-left:70px!important}.ml-75{margin-left:75px!important}.ml-80{margin-left:80px!important}.ml-85{margin-left:85px!important}.ml-90{margin-left:90px!important}.ml-95{margin-left:95px!important}.ml-100{margin-left:100px!important}.mr-0{margin-right:0!important}.mr-5{margin-right:5px!important}.mr-10{margin-right:10px!important}.mr-15{margin-right:15px!important}.mr-20{margin-right:20px!important}.mr-25{margin-right:25px!important}.mr-30{margin-right:30px!important}.mr-35{margin-right:35px!important}.mr-40{margin-right:40px!important}.mr-45{margin-right:45px!important}.mr-50{margin-right:50px!important}.mr-55{margin-right:55px!important}.mr-60{margin-right:60px!important}.mr-65{margin-right:65px!important}.mr-70{margin-right:70px!important}.mr-75{margin-right:75px!important}.mr-80{margin-right:80px!important}.mr-85{margin-right:85px!important}.mr-90{margin-right:90px!important}.mr-95{margin-right:95px!important}.mr-100{margin-right:100px!important}.pt-0{padding-top:0!important}.pt-5{padding-top:5px!important}.pt-10{padding-top:10px!important}.pt-15{padding-top:15px!important}.pt-20{padding-top:20px!important}.pt-25{padding-top:25px!important}.pt-30{padding-top:30px!important}.pt-35{padding-top:35px!important}.pt-40{padding-top:40px!important}.pt-45{padding-top:45px!important}.pt-50{padding-top:50px!important}.pt-55{padding-top:55px!important}.pt-60{padding-top:60px!important}.pt-65{padding-top:65px!important}.pt-70{padding-top:70px!important}.pt-75{padding-top:75px!important}.pt-80{padding-top:80px!important}.pt-85{padding-top:85px!important}.pt-90{padding-top:90px!important}.pt-95{padding-top:95px!important}.pt-100{padding-top:100px!important}.pb-0{padding-bottom:0!important}.pb-5{padding-bottom
:5px!important}.pb-10{padding-bottom:10px!important}.pb-15{padding-bottom:15px!important}.pb-20{padding-bottom:20px!important}.pb-25{padding-bottom:25px!important}.pb-30{padding-bottom:30px!important}.pb-35{padding-bottom:35px!important}.pb-40{padding-bottom:40px!important}.pb-45{padding-bottom:45px!important}.pb-50{padding-bottom:50px!important}.pb-55{padding-bottom:55px!important}.pb-60{padding-bottom:60px!important}.pb-65{padding-bottom:65px!important}.pb-70{padding-bottom:70px!important}.pb-75{padding-bottom:75px!important}.pb-80{padding-bottom:80px!important}.pb-85{padding-bottom:85px!important}.pb-90{padding-bottom:90px!important}.pb-95{padding-bottom:95px!important}.pb-100{padding-bottom:100px!important}.pl-0{padding-left:0!important}.pl-5{padding-left:5px!important}.pl-10{padding-left:10px!important}.pl-15{padding-left:15px!important}.pl-20{padding-left:20px!important}.pl-25{padding-left:25px!important}.pl-30{padding-left:30px!important}.pl-35{padding-left:35px!important}.pl-40{padding-left:40px!important}.pl-45{padding-left:45px!important}.pl-50{padding-left:50px!important}.pl-55{padding-left:55px!important}.pl-60{padding-left:60px!important}.pl-65{padding-left:65px!important}.pl-70{padding-left:70px!important}.pl-75{padding-left:75px!important}.pl-80{padding-left:80px!important}.pl-85{padding-left:85px!important}.pl-90{padding-left:90px!important}.pl-95{padding-left:95px!important}.pl-100{padding-left:100px!important}.pr-0{padding-right:0!important}.pr-5{padding-right:5px!important}.pr-10{padding-right:10px!important}.pr-15{padding-right:15px!important}.pr-20{padding-right:20px!important}.pr-25{padding-right:25px!important}.pr-30{padding-right:30px!important}.pr-35{padding-right:35px!important}.pr-40{padding-right:40px!important}.pr-45{padding-right:45px!important}.pr-50{padding-right:50px!important}.pr-55{padding-right:55px!important}.pr-60{padding-right:60px!important}.pr-65{padding-right:65px!important}.pr-70{padding-right:70px!important}.pr-75{padding-
right:75px!important}.pr-80{padding-right:80px!important}.pr-85{padding-right:85px!important}.pr-90{padding-right:90px!important}.pr-95{padding-right:95px!important}.pr-100{padding-right:100px!important}.logo-img{max-width:340px;width:100%} .mt-0{margin-top:0!important}.mt-5{margin-top:5px!important}.mt-10{margin-top:10px!important}.mt-15{margin-top:15px!important}.mt-20{margin-top:20px!important}.mt-25{margin-top:25px!important}.mt-30{margin-top:30px!important}.mt-35{margin-top:35px!important}.mt-40{margin-top:40px!important}.mt-45{margin-top:45px!important}.mt-50{margin-top:50px!important}.mt-55{margin-top:55px!important}.mt-60{margin-top:60px!important}.mt-65{margin-top:65px!important}.mt-70{margin-top:70px!important}.mt-75{margin-top:75px!important}.mt-80{margin-top:80px!important}.mt-85{margin-top:85px!important}.mt-90{margin-top:90px!important}.mt-95{margin-top:95px!important}.mt-100{margin-top:100px!important}.mb-0{margin-bottom:0!important}.mb-5{margin-bottom:5px!important}.mb-10{margin-bottom:10px!important}.mb-15{margin-bottom:15px!important}.mb-20{margin-bottom:20px!important}.mb-25{margin-bottom:25px!important}.mb-30{margin-bottom:30px!important}.mb-35{margin-bottom:35px!important}.mb-40{margin-bottom:40px!important}.mb-45{margin-bottom:45px!important}.mb-50{margin-bottom:50px!important}.mb-55{margin-bottom:55px!important}.mb-60{margin-bottom:60px!important}.mb-65{margin-bottom:65px!important}.mb-70{margin-bottom:70px!important}.mb-75{margin-bottom:75px!important}.mb-80{margin-bottom:80px!important}.mb-85{margin-bottom:85px!important}.mb-90{margin-bottom:90px!important}.mb-95{margin-bottom:95px!important}.mb-100{margin-bottom:100px!important}.ml-0{margin-left:0!important}.ml-5{margin-left:5px!important}.ml-10{margin-left:10px!important}.ml-15{margin-left:15px!important}.ml-20{margin-left:20px!important}.ml-25{margin-left:25px!important}.ml-30{margin-left:30px!important}.ml-35{margin-left:35px!important}.ml-40{margin-left:40px!important}.ml-45{margin-left
:45px!important}.ml-50{margin-left:50px!important}.ml-55{margin-left:55px!important}.ml-60{margin-left:60px!important}.ml-65{margin-left:65px!important}.ml-70{margin-left:70px!important}.ml-75{margin-left:75px!important}.ml-80{margin-left:80px!important}.ml-85{margin-left:85px!important}.ml-90{margin-left:90px!important}.ml-95{margin-left:95px!important}.ml-100{margin-left:100px!important}.mr-0{margin-right:0!important}.mr-5{margin-right:5px!important}.mr-10{margin-right:10px!important}.mr-15{margin-right:15px!important}.mr-20{margin-right:20px!important}.mr-25{margin-right:25px!important}.mr-30{margin-right:30px!important}.mr-35{margin-right:35px!important}.mr-40{margin-right:40px!important}.mr-45{margin-right:45px!important}.mr-50{margin-right:50px!important}.mr-55{margin-right:55px!important}.mr-60{margin-right:60px!important}.mr-65{margin-right:65px!important}.mr-70{margin-right:70px!important}.mr-75{margin-right:75px!important}.mr-80{margin-right:80px!important}.mr-85{margin-right:85px!important}.mr-90{margin-right:90px!important}.mr-95{margin-right:95px!important}.mr-100{margin-right:100px!important}.pt-0{padding-top:0!important}.pt-5{padding-top:5px!important}.pt-10{padding-top:10px!important}.pt-15{padding-top:15px!important}.pt-20{padding-top:20px!important}.pt-25{padding-top:25px!important}.pt-30{padding-top:30px!important}.pt-35{padding-top:35px!important}.pt-40{padding-top:40px!important}.pt-45{padding-top:45px!important}.pt-50{padding-top:50px!important}.pt-55{padding-top:55px!important}.pt-60{padding-top:60px!important}.pt-65{padding-top:65px!important}.pt-70{padding-top:70px!important}.pt-75{padding-top:75px!important}.pt-80{padding-top:80px!important}.pt-85{padding-top:85px!important}.pt-90{padding-top:90px!important}.pt-95{padding-top:95px!important}.pt-100{padding-top:100px!important}.pb-0{padding-bottom:0!important}.pb-5{padding-bottom:5px!important}.pb-10{padding-bottom:10px!important}.pb-15{padding-bottom:15px!important}.pb-20{padding-bottom:20p
x!important}.pb-25{padding-bottom:25px!important}.pb-30{padding-bottom:30px!important}.pb-35{padding-bottom:35px!important}.pb-40{padding-bottom:40px!important}.pb-45{padding-bottom:45px!important}.pb-50{padding-bottom:50px!important}.pb-55{padding-bottom:55px!important}.pb-60{padding-bottom:60px!important}.pb-65{padding-bottom:65px!important}.pb-70{padding-bottom:70px!important}.pb-75{padding-bottom:75px!important}.pb-80{padding-bottom:80px!important}.pb-85{padding-bottom:85px!important}.pb-90{padding-bottom:90px!important}.pb-95{padding-bottom:95px!important}.pb-100{padding-bottom:100px!important}.pl-0{padding-left:0!important}.pl-5{padding-left:5px!important}.pl-10{padding-left:10px!important}.pl-15{padding-left:15px!important}.pl-20{padding-left:20px!important}.pl-25{padding-left:25px!important}.pl-30{padding-left:30px!important}.pl-35{padding-left:35px!important}.pl-40{padding-left:40px!important}.pl-45{padding-left:45px!important}.pl-50{padding-left:50px!important}.pl-55{padding-left:55px!important}.pl-60{padding-left:60px!important}.pl-65{padding-left:65px!important}.pl-70{padding-left:70px!important}.pl-75{padding-left:75px!important}.pl-80{padding-left:80px!important}.pl-85{padding-left:85px!important}.pl-90{padding-left:90px!important}.pl-95{padding-left:95px!important}.pl-100{padding-left:100px!important}.pr-0{padding-right:0!important}.pr-5{padding-right:5px!important}.pr-10{padding-right:10px!important}.pr-15{padding-right:15px!important}.pr-20{padding-right:20px!important}.pr-25{padding-right:25px!important}.pr-30{padding-right:30px!important}.pr-35{padding-right:35px!important}.pr-40{padding-right:40px!important}.pr-45{padding-right:45px!important}.pr-50{padding-right:50px!important}.pr-55{padding-right:55px!important}.pr-60{padding-right:60px!important}.pr-65{padding-right:65px!important}.pr-70{padding-right:70px!important}.pr-75{padding-right:75px!important}.pr-80{padding-right:80px!important}.pr-85{padding-right:85px!important}.pr-90{padding-right:
90px!important}.pr-95{padding-right:95px!important}.pr-100{padding-right:100px!important}.tabs{border-bottom:2px solid #939ab1;display:flex;justify-content:space-evenly;margin:40px 0 30px}@media screen and (max-width:33.1875em){.tabs{text-align:center}}.tabs a{border-bottom:2px solid #939ab1;color:#1a254c;font-size:1.6rem;font-weight:700;margin-bottom:-2px;padding:10px 10px 5px}.tabs a:focus,.tabs a:hover{background-color:#b8d6f4;border-bottom:2px solid #1e70c2;outline:none;-webkit-text-decoration:none;text-decoration:none}.tabs .nuxt-link-exact-active{border-bottom:2px solid #1e70c2;color:#1e70c2}.tabs-metrics,.tabs-tweetations{align-items:baseline;display:flex;flex-wrap:wrap;justify-content:flex-start;margin-bottom:20px}.tabs-metrics a,.tabs-tweetations a{color:#1a254c;cursor:pointer;padding:10px 0 5px;text-align:center}.tabs-metrics a:focus,.tabs-metrics a:hover,.tabs-tweetations a:focus,.tabs-tweetations a:hover{color:#1e70c2}.tabs-metrics span,.tabs-tweetations span{margin:0 10px}.active{border-bottom:2px solid #1e70c2!important;color:#1e70c2!important}.mobile-show{display:none}@media screen and (max-width:47.9375em){.mobile-show{display:block}}.desktop-show{display:block}@media screen and (max-width:47.9375em){.desktop-show{display:none}}.advertisement{margin-top:25px}.advertisement:focus .advertisement__text,.advertisement:hover .advertisement__text{color:#1e70c2}.advertisement__link{background-color:#f1f3f5;border-radius:3px;display:flex;font-weight:700;justify-content:space-between;padding:20px;-webkit-text-decoration:none;text-decoration:none}@media screen and (max-width:33.1875em){.advertisement__link{flex-wrap:wrap}}.advertisement__text{color:#000;font-size:1.8rem;padding-right:10px}.advertisement__button{align-self:center;background-color:#1e70c2;border-radius:3px;color:#fff;font-size:2rem;padding:15px 20px;position:relative;transition:.3s;width:200px}@media screen and (max-width:74.9375em){.advertisement__button{width:150px}}@media screen and 
(max-width:47.9375em){.advertisement__button{width:200px}}@media screen and (max-width:33.1875em){.advertisement__button{margin-left:auto;margin-top:10px;padding:10px 20px;width:170px}}.advertisement__button .advertisement__icon{font-size:1.7rem;position:absolute;right:2.2rem;top:2rem;transition:.3s}@media screen and (max-width:33.1875em){.advertisement__button .advertisement__icon{top:1.5rem}}.advertisement__button:focus,.advertisement__button:hover{opacity:.9}.advertisement__button:focus .advertisement__icon,.advertisement__button:hover .advertisement__icon{right:1.5rem}.authors-for-screen-reader{height:1px;left:-10000px;overflow:hidden;position:absolute;top:auto;width:1px}.authors-for-screen-reader:focus{display:block;font-weight:700;height:auto;outline:2px solid #f69038!important;outline-offset:6px!important;overflow:hidden;position:static;width:auto}.main .details{background-color:#f1f3f5;border-radius:3px;margin-top:25px;padding:10px}.main .details p{font-size:1.6rem;font-weight:700;margin:0}.main .preprints-version{align-items:baseline;display:flex;margin-top:10px}.info{display:flex;flex-wrap:wrap}.info__article-img{cursor:pointer;flex:0 0 22%;height:-moz-fit-content;height:fit-content;margin-right:20px;position:relative}@media screen and (max-width:61.9375em){.info__article-img{display:none}}.info__article-img img{display:block;width:100%}.info__article-img-info{background-color:rgba(26,37,76,.749);bottom:0;color:#fff;height:0;left:0;margin:0;overflow:hidden;position:absolute;right:0;text-align:center;transition:.5s ease;width:100%}.info__article-img-info .icon{font-size:3rem;margin-top:30%}.info__article-img:hover .info__article-img-info{height:100%}.info__title-authors{flex:0 0 75%}@media screen and (max-width:61.9375em){.info__title-authors{flex:0 0 100%}}.info__title-authors h3:focus{outline:2px solid 
#f69038!important;outline-offset:6px!important}.info__hidden-title{height:1px;left:-10000px;overflow:hidden;position:absolute;top:auto;width:1px}.info__authors{display:inline-block}.info__orcid-img{height:15px}.tabs a{flex-grow:1;text-align:center}@media screen and (max-width:47.9375em){.tabs{flex-direction:column}}.sidebar-citation .export-metadata div{margin-bottom:10px}.sidebar-citation .collection h4:focus{outline:2px solid #f69038!important;outline-offset:6px!important}.sidebar-citation .collection__span{display:block}@media screen and (max-width:61.9375em){.sidebar-citation .collection__span{display:inline-block}}.sidebar-citation .collection__link{background:#1e70c2;border-radius:3px;color:#fff;display:block;font-size:1.2rem;margin-bottom:5px;padding:5px 10px;width:-moz-fit-content;width:fit-content}.sidebar-citation .download-btns{display:flex;flex-wrap:wrap;justify-content:space-between;margin-bottom:20px}@media screen and (max-width:61.9375em){.sidebar-citation .download-btns{justify-content:flex-start}.sidebar-citation .download-btns a{margin-right:10px}}.article-content h3{font-size:2.2rem;line-height:2.4rem}.article-content ol li,.article-content ul li{margin-bottom:10px}.main-article-content a{color:#1e70c2}.main-article-content a:hover{-webkit-text-decoration:underline;text-decoration:underline}.author-affiliation-details,.authors-container .authors.clearfix .clearfix,.corresponding-author-and-affiliations,.h4-original-paper{display:none}.authors .clearfix{display:flex;flex-wrap:wrap;list-style:none;padding:0}.authors .clearfix li{margin-right:10px}.authors .clearfix li a{color:#1e70c2}.article-content figure{background-color:#f1f3f5;border-radius:3px;margin:0;padding:20px}#Abstract{margin-top:10px}#Abstract,#Discussion,#Introduction,#Keywords,#Methods,#Results{border-bottom:1px solid 
#ccd1d5}.abstract-sub-heading{display:block;font-size:1.6rem;font-weight:700}.figure-table{background:#f1f3f5;border-radius:3px;height:auto;margin-bottom:20px;overflow-x:auto;overflow-y:hidden;padding:20px 20px 0}.figure-table::-webkit-scrollbar{-webkit-appearance:none}.figure-table::-webkit-scrollbar-track{border-radius:8px;box-shadow:inset 0 0 5px #5d6581}.figure-table::-webkit-scrollbar:vertical{width:8px}.figure-table::-webkit-scrollbar:horizontal{height:8px}.figure-table::-webkit-scrollbar-thumb{background-color:#1e70c2;border-radius:8px}.textbox-container{border:2px solid #333;padding:15px}.textbox-container h5{margin-bottom:0;margin-top:0}.footnotes ol{word-wrap:break-word}img.figure-image,img.graphic-image,img.inline-graphic-image{width:100%}table{margin:0}td,th{padding:5px 0 5px 15px}.careers{margin-top:25px}.career-widget{border-bottom:1px solid #ccc;cursor:pointer;padding:5px 20px 20px;transition:box-shadow .3s ease}.career-widget:hover{box-shadow:0 0 10px 0 rgba(0,0,0,.1)}.job,.job:hover{color:inherit;-webkit-text-decoration:none;text-decoration:none}.job:hover .job-title{-webkit-text-decoration:underline;text-decoration:underline}.job-title{font-weight:700;margin-top:20px} 
.mt-0{margin-top:0!important}.mt-5{margin-top:5px!important}.mt-10{margin-top:10px!important}.mt-15{margin-top:15px!important}.mt-20{margin-top:20px!important}.mt-25{margin-top:25px!important}.mt-30{margin-top:30px!important}.mt-35{margin-top:35px!important}.mt-40{margin-top:40px!important}.mt-45{margin-top:45px!important}.mt-50{margin-top:50px!important}.mt-55{margin-top:55px!important}.mt-60{margin-top:60px!important}.mt-65{margin-top:65px!important}.mt-70{margin-top:70px!important}.mt-75{margin-top:75px!important}.mt-80{margin-top:80px!important}.mt-85{margin-top:85px!important}.mt-90{margin-top:90px!important}.mt-95{margin-top:95px!important}.mt-100{margin-top:100px!important}.mb-0{margin-bottom:0!important}.mb-5{margin-bottom:5px!important}.mb-10{margin-bottom:10px!important}.mb-15{margin-bottom:15px!important}.mb-20{margin-bottom:20px!important}.mb-25{margin-bottom:25px!important}.mb-30{margin-bottom:30px!important}.mb-35{margin-bottom:35px!important}.mb-40{margin-bottom:40px!important}.mb-45{margin-bottom:45px!important}.mb-50{margin-bottom:50px!important}.mb-55{margin-bottom:55px!important}.mb-60{margin-bottom:60px!important}.mb-65{margin-bottom:65px!important}.mb-70{margin-bottom:70px!important}.mb-75{margin-bottom:75px!important}.mb-80{margin-bottom:80px!important}.mb-85{margin-bottom:85px!important}.mb-90{margin-bottom:90px!important}.mb-95{margin-bottom:95px!important}.mb-100{margin-bottom:100px!important}.ml-0{margin-left:0!important}.ml-5{margin-left:5px!important}.ml-10{margin-left:10px!important}.ml-15{margin-left:15px!important}.ml-20{margin-left:20px!important}.ml-25{margin-left:25px!important}.ml-30{margin-left:30px!important}.ml-35{margin-left:35px!important}.ml-40{margin-left:40px!important}.ml-45{margin-left:45px!important}.ml-50{margin-left:50px!important}.ml-55{margin-left:55px!important}.ml-60{margin-left:60px!important}.ml-65{margin-left:65px!important}.ml-70{margin-left:70px!important}.ml-75{margin-left:75px!important}.ml-80{margin-left:80
px!important}.ml-85{margin-left:85px!important}.ml-90{margin-left:90px!important}.ml-95{margin-left:95px!important}.ml-100{margin-left:100px!important}.mr-0{margin-right:0!important}.mr-5{margin-right:5px!important}.mr-10{margin-right:10px!important}.mr-15{margin-right:15px!important}.mr-20{margin-right:20px!important}.mr-25{margin-right:25px!important}.mr-30{margin-right:30px!important}.mr-35{margin-right:35px!important}.mr-40{margin-right:40px!important}.mr-45{margin-right:45px!important}.mr-50{margin-right:50px!important}.mr-55{margin-right:55px!important}.mr-60{margin-right:60px!important}.mr-65{margin-right:65px!important}.mr-70{margin-right:70px!important}.mr-75{margin-right:75px!important}.mr-80{margin-right:80px!important}.mr-85{margin-right:85px!important}.mr-90{margin-right:90px!important}.mr-95{margin-right:95px!important}.mr-100{margin-right:100px!important}.pt-0{padding-top:0!important}.pt-5{padding-top:5px!important}.pt-10{padding-top:10px!important}.pt-15{padding-top:15px!important}.pt-20{padding-top:20px!important}.pt-25{padding-top:25px!important}.pt-30{padding-top:30px!important}.pt-35{padding-top:35px!important}.pt-40{padding-top:40px!important}.pt-45{padding-top:45px!important}.pt-50{padding-top:50px!important}.pt-55{padding-top:55px!important}.pt-60{padding-top:60px!important}.pt-65{padding-top:65px!important}.pt-70{padding-top:70px!important}.pt-75{padding-top:75px!important}.pt-80{padding-top:80px!important}.pt-85{padding-top:85px!important}.pt-90{padding-top:90px!important}.pt-95{padding-top:95px!important}.pt-100{padding-top:100px!important}.pb-0{padding-bottom:0!important}.pb-5{padding-bottom:5px!important}.pb-10{padding-bottom:10px!important}.pb-15{padding-bottom:15px!important}.pb-20{padding-bottom:20px!important}.pb-25{padding-bottom:25px!important}.pb-30{padding-bottom:30px!important}.pb-35{padding-bottom:35px!important}.pb-40{padding-bottom:40px!important}.pb-45{padding-bottom:45px!important}.pb-50{padding-bottom:50px!important}.pb-55{
padding-bottom:55px!important}.pb-60{padding-bottom:60px!important}.pb-65{padding-bottom:65px!important}.pb-70{padding-bottom:70px!important}.pb-75{padding-bottom:75px!important}.pb-80{padding-bottom:80px!important}.pb-85{padding-bottom:85px!important}.pb-90{padding-bottom:90px!important}.pb-95{padding-bottom:95px!important}.pb-100{padding-bottom:100px!important}.pl-0{padding-left:0!important}.pl-5{padding-left:5px!important}.pl-10{padding-left:10px!important}.pl-15{padding-left:15px!important}.pl-20{padding-left:20px!important}.pl-25{padding-left:25px!important}.pl-30{padding-left:30px!important}.pl-35{padding-left:35px!important}.pl-40{padding-left:40px!important}.pl-45{padding-left:45px!important}.pl-50{padding-left:50px!important}.pl-55{padding-left:55px!important}.pl-60{padding-left:60px!important}.pl-65{padding-left:65px!important}.pl-70{padding-left:70px!important}.pl-75{padding-left:75px!important}.pl-80{padding-left:80px!important}.pl-85{padding-left:85px!important}.pl-90{padding-left:90px!important}.pl-95{padding-left:95px!important}.pl-100{padding-left:100px!important}.pr-0{padding-right:0!important}.pr-5{padding-right:5px!important}.pr-10{padding-right:10px!important}.pr-15{padding-right:15px!important}.pr-20{padding-right:20px!important}.pr-25{padding-right:25px!important}.pr-30{padding-right:30px!important}.pr-35{padding-right:35px!important}.pr-40{padding-right:40px!important}.pr-45{padding-right:45px!important}.pr-50{padding-right:50px!important}.pr-55{padding-right:55px!important}.pr-60{padding-right:60px!important}.pr-65{padding-right:65px!important}.pr-70{padding-right:70px!important}.pr-75{padding-right:75px!important}.pr-80{padding-right:80px!important}.pr-85{padding-right:85px!important}.pr-90{padding-right:90px!important}.pr-95{padding-right:95px!important}.pr-100{padding-right:100px!important}@media screen and (max-width:61.9375em){.sidebar-sections{display:none}}.sidebar-sections ul{color:#1a254c;padding:13px 10px 0 15px}.sidebar-sections 
ul li{margin-bottom:1rem}.sidebar-sections ul li a{color:#1a254c;font-weight:400!important;word-break:break-word}.sidebar-sections ul li a:hover{color:#1e70c2}.sidebar-sections ul li a:focus{-webkit-text-decoration:none!important;text-decoration:none!important}.sidebar-sections ul li:hover .active{-webkit-text-decoration:none;text-decoration:none}.sidebar-sections ul li .active{font-weight:400!important;-webkit-text-decoration:none;text-decoration:none}.sidebar-nav{height:100%;left:0;position:absolute;top:0;width:100%}.sidebar-nav-sticky{position:sticky;top:0}.article{padding:0}.tooltip{cursor:pointer;position:relative}.tooltip .tooltiptext{background-color:#000;border-radius:6px;color:#fff;font-family:"Helvetica","Sans-serif";font-size:.9rem;left:50%;margin-left:-125px;margin-top:2px;min-width:250px;padding:10px;position:absolute;text-align:center;text-transform:none;top:100%;visibility:hidden;z-index:1}@media(max-width:768px){.tooltip .tooltiptext{display:none}}.tooltip:hover .tooltiptext{visibility:visible}@media(min-width:768px){.visual-abstract{display:none}}@media(max-width:767px){.visual-abstract{display:block}}.figure-table ul{margin:auto}named-content{display:block;padding-left:1em;white-space:pre-wrap} 
.email-subscribtion-button{margin-left:auto}.email-subscribtion-button h3{margin-top:0}@media screen and
(max-width:35.9375em){.email-subscribtion-button h3{margin-top:50px}}.email-subscribtion-button a{background-color:#fff;border:2px solid #fff;border-radius:100px;color:#1a254c;cursor:pointer;display:block;margin:auto;padding:10px;position:relative;text-align:center;transition:all .2s ease;width:205px}.email-subscribtion-button a span{font-size:1.6rem;font-weight:700;position:relative}.email-subscribtion-button a span.icon{position:absolute;right:38px;transition:all .2s ease}.email-subscribtion-button a:before{background-color:#1a254c;border-radius:28px;content:"";display:block;height:38px;left:0;position:absolute;top:0;transition:all .3s ease;width:50px}.email-subscribtion-button a:hover{color:#fff}.email-subscribtion-button a:hover span.icon{right:25px;transform:rotate(45deg)}.email-subscribtion-button a:hover:before{width:100%}.email-subscribtion-button a:active{color:#1a254c}.email-subscribtion-button a:active span.icon{right:20px}.email-subscribtion-button a:active:before{background-color:#fff}.footer-modal-window .modal-window-body{max-height:650px}.footer-modal-window .mc-field-group ul{padding:0}.footer-modal-window .mc-field-group ul li{list-style-type:none;margin-bottom:2px}.footer-modal-window .mc-field-group ul li input{cursor:pointer;height:15px;width:15px}.footer-modal-window .mc-field-group ul li label{font-size:1.6rem}.footer-modal-window .email-subscribtion label{display:block;font-weight:700;margin:5px 0;max-width:-moz-fit-content;max-width:fit-content}.footer-modal-window .email-subscribtion input{width:100%}.footer-modal-window .email-subscribtion input.error{background-color:#fcf1f1!important;border:1px solid red!important}.footer-modal-window .email-subscribtion small.error{color:red;display:block}.footer{background:#1a254c}.footer-journal-name{background:#111831;padding:7px 0}.footer-journal-name h2{color:#fff;font-size:1.8rem;line-height:2.4rem;margin:5px 
0;text-align:center}.footer-title{color:#fff;font-size:1.6rem;font-weight:700;line-height:2.2rem;margin:50px 0 18px}.footer-title:focus{outline:2px solid #f69038!important;outline-offset:6px!important}.footer ul{padding:0}.footer ul li{list-style-type:none;margin-bottom:12px}.footer ul li a{color:#fff;font-size:1.6rem;font-weight:light;opacity:.6;-webkit-text-decoration:none;text-decoration:none}.footer ul li a:focus,.footer ul li a:hover{opacity:1}.footer-social .twitter a{border-radius:50px;margin-right:1px;padding:4px 5px;text-align:center}.footer-social .twitter a,.footer-social .twitter a:hover{background-color:#000;transition:all .2s ease}.footer-social .facebook a{background-color:#3b5a98;border-radius:50px;margin-right:1px;padding:4px 5px;text-align:center;transition:all .2s ease}.footer-social .facebook a:hover{background-color:#344f86;transition:all .2s ease}.footer-social .linkedin a{background-color:#0077b5;border-radius:50px;margin-right:1px;padding:4px 5px;text-align:center;transition:all .2s ease}.footer-social .linkedin a:hover{background-color:#00669c;transition:all .2s ease}.footer-social .youtube a{background-color:red;border-radius:50px;margin-right:1px;padding:4px 5px;text-align:center;transition:all .2s ease}.footer-social .youtube a:hover{background-color:#e60000;transition:all .2s ease}.footer-social a:hover{-webkit-text-decoration:none;text-decoration:none}.footer-social a i{color:#fff;text-align:center;width:14px}.footer-copyright{border-top:1px solid #fff;color:#fff;font-size:1.2rem;margin:20px 0 40px;opacity:.6;padding-top:20px}@media screen and (max-width:20.6875em){.footer .rss{margin-top:.25rem}} 
.scroll-to-very-top{background-color:#1e70c2;border-radius:3px 0 0 3px;bottom:.7vh;box-shadow:0 0 5px
rgba(0,0,0,.2);position:fixed;right:0;z-index:2}.scroll-to-very-top:focus,.scroll-to-very-top:hover{opacity:.9}.scroll-to-very-top:focus{border:2px solid #f69038;margin:1px}.scroll-to-very-top .icon{color:#fff;font-size:2.5rem;padding:9px;transition:all .3s}</style></head><body ><div data-server-rendered="true" id="__nuxt"><!----><div id="__layout"><div id="jmir-html"><span tabindex="-1"></span> <div id="skip-link"><a href="#main-content"> Skip to Main Content <span aria-hidden="true" class="icon fas fa-chevron-down"></span></a> <a href="#footer"> Skip to Footer <span aria-hidden="true" class="icon fas fa-chevron-down"></span></a></div> <!----> <div data-v-49c694ee><span tabindex="0" aria-label="Accessibility settings" title="Accessibility settings" role="button" class="icon fas fa-universal-access universal-access-btn" data-v-49c694ee></span> <!----> <style type="text/css" data-v-49c694ee> html { filter: none !important } html { font-weight: inherit !important } html { font-size: 0.625rem !important } html { text-align: initial !important } *:not svg { font-weight: inherit } </style></div> <div id="main-layout-container"><div style="background-color: white;"><div id="scroll-to-very-top"><header id="header"><section class="header"><nav><div data-v-575455fb><nav aria-label="Navigation" data-v-575455fb><section class="top-nav" data-v-575455fb><div class="container" data-v-575455fb><!----> <div class="corporate" data-v-575455fb><div class="corporate__logo" data-v-575455fb><a href="https://jmirpublications.com" aria-label="JMIR Publications main website" data-v-575455fb><div data-v-575455fb><img src="https://asset.jmir.pub/resources/images/logos/JMIR-25-year-logo-less-white.png" alt="JMIR Publications" class="logo-img"></div></a></div> <div class="corporate__mobile-menu" data-v-575455fb><button data-v-575455fb><span aria-hidden="true" class="icon fas fa-bars" data-v-575455fb></span></button></div> <div class="corporate__search" data-v-575455fb><label 
for="corporate__search-select" class="screen-readers-only" data-v-575455fb> Select options </label> <select id="corporate__search-select" data-test="search-select" class="corporate__search-select" data-v-575455fb><option value="articles" selected="selected" data-v-575455fb> Articles </option> <option value="help" data-v-575455fb> Help </option></select> <input name="type" type="hidden" value="text" data-v-575455fb> <!----><!----><!----><!----><!----> <button aria-label="Search articles" data-test="unisearch-button" class="btn btn-small btn-blue corporate__search-btn" data-v-575455fb><span aria-hidden="true" class="icon fas fa-search" data-v-575455fb></span></button></div> <div class="corporate__nav" data-v-575455fb><ul class="corporate__links" data-v-575455fb><li tabindex="0" aria-haspopup="true" data-test="resource-center" class="corporate__link-item" data-v-575455fb><a href="https://careers.jmir.org/" class="corporate__link" data-v-575455fb> Career Center </a></li> <li class="corporate__link-item" data-v-575455fb><a id="button-login" href="javascript:;" data-test="login-button" class="corporate__link" data-v-575455fb>Login</a></li> <li id="button-register" data-test="register-button" class="corporate__link-item" data-v-575455fb><a class="corporate__link" data-v-575455fb>Register</a></li> <!----></ul></div></div></div></section> <section class="bottom-nav-1" data-v-575455fb><div class="container" data-v-575455fb><div class="journal" data-v-575455fb><div class="journal__nav" data-v-575455fb><ul class="journal__links" data-v-575455fb><li class="journal__link-item journal__link-item--journals" data-v-575455fb><div aria-haspopup="true" class="journal__link-item-container" data-v-575455fb><a href="/" aria-label="Journal of Medical Internet Research home page" class="journal__link--home nuxt-link-active" data-v-575455fb><span aria-hidden="true" class="icon fas fa-home" style="font-size:16px;" data-v-575455fb></span> Journal of Medical Internet Research </a> <span 
aria-expanded="false" aria-label="Other journals" tabindex="0" role="button" class="journal__journals-list" data-v-575455fb><span aria-hidden="true" data-test="journal-list-show-dropdown-button" class="icon fas fa-arrow-down" style="padding: 16px 16px;" data-v-575455fb></span></span></div> <ul aria-label="submenu" data-test="journal-list-dropdown" class="journal__link-submenu-journals" data-v-575455fb><li class="m-0" data-v-575455fb><a href="/" class="nuxt-link-active" data-v-575455fb><span data-v-575455fb>Journal of Medical Internet Research</span> <span class="articles-number" data-v-575455fb>9001 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://www.researchprotocols.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Research Protocols</span> <span class="articles-number" data-v-575455fb>4321 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://formative.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Formative Research</span> <span class="articles-number" data-v-575455fb>3050 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://mhealth.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR mHealth and uHealth</span> <span class="articles-number" data-v-575455fb>2730 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://ojphi.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>Online Journal of Public Health Informatics</span> <span class="articles-number" data-v-575455fb>1717 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://publichealth.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Public Health and Surveillance</span> <span class="articles-number" data-v-575455fb>1639 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://medinform.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Medical Informatics</span> <span class="articles-number" 
data-v-575455fb>1405 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://mental.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Mental Health</span> <span class="articles-number" data-v-575455fb>1073 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://humanfactors.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Human Factors</span> <span class="articles-number" data-v-575455fb>798 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://games.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Serious Games</span> <span class="articles-number" data-v-575455fb>630 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://mededu.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Medical Education</span> <span class="articles-number" data-v-575455fb>552 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://www.iproc.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>Iproceedings</span> <span class="articles-number" data-v-575455fb>510 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://www.i-jmr.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>Interactive Journal of Medical Research</span> <span class="articles-number" data-v-575455fb>429 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://aging.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Aging</span> <span class="articles-number" data-v-575455fb>424 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://xmed.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIRx Med</span> <span class="articles-number" data-v-575455fb>420 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://pediatrics.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Pediatrics and Parenting</span> <span class="articles-number" 
data-v-575455fb>401 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://cancer.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Cancer</span> <span class="articles-number" data-v-575455fb>381 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://derma.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Dermatology</span> <span class="articles-number" data-v-575455fb>307 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://diabetes.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Diabetes</span> <span class="articles-number" data-v-575455fb>266 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://rehab.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Rehabilitation and Assistive Technologies</span> <span class="articles-number" data-v-575455fb>263 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://cardio.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Cardio</span> <span class="articles-number" data-v-575455fb>194 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://infodemiology.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Infodemiology</span> <span class="articles-number" data-v-575455fb>149 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://ai.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR AI</span> <span class="articles-number" data-v-575455fb>118 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://periop.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Perioperative Medicine</span> <span class="articles-number" data-v-575455fb>115 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://nursing.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Nursing</span> <span class="articles-number" 
data-v-575455fb>114 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://jopm.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>Journal of Participatory Medicine</span> <span class="articles-number" data-v-575455fb>96 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://biomedeng.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Biomedical Engineering</span> <span class="articles-number" data-v-575455fb>90 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://bioinform.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Bioinformatics and Biotechnology</span> <span class="articles-number" data-v-575455fb>51 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://www.medicine20.com" no-prefetch="" data-v-575455fb><span data-v-575455fb>Medicine 2.0</span> <span class="articles-number" data-v-575455fb>26 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://apinj.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>Asian/Pacific Island Nursing Journal</span> <span class="articles-number" data-v-575455fb>25 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://neuro.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Neurotechnology</span> <span class="articles-number" data-v-575455fb>24 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://xbio.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIRx Bio</span> <span class="articles-number" data-v-575455fb>22 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://xr.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR XR and Spatial Computing (JMXR)</span> <span class="articles-number" data-v-575455fb>18 articles </span></a></li><li class="m-0" data-v-575455fb><a href="https://data.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Data</span> 
<!----></a></li><li class="m-0" data-v-575455fb><a href="https://challenges.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Challenges</span> <!----></a></li><li class="m-0" data-v-575455fb><a href="https://preprints.jmir.org" no-prefetch="" data-v-575455fb><span data-v-575455fb>JMIR Preprints</span> <!----></a></li></ul></li> <li tabindex="0" aria-haspopup="true" data-test="journal-information" class="journal__link-item journal__link-item--journal-info" data-v-575455fb><span class="journal__link" data-v-575455fb> Journal Information </span> <span aria-hidden="true" class="icon fas fa-caret-down" data-v-575455fb></span> <ul aria-label="submenu" data-test="journal-info-popover" class="journal__link-submenu journal__link-submenu--journal-info" data-v-575455fb><li data-v-575455fb><a href="/about-journal/focus-and-scope" data-v-575455fb> Focus and Scope </a></li><li data-v-575455fb><a href="/about-journal/editorial-board" data-v-575455fb> Editorial Board </a></li><li data-v-575455fb><a href="/author-information/instructions-for-authors" data-v-575455fb> Author Information </a></li><li data-v-575455fb><a href="/resource-centre/author-hub" data-v-575455fb> Resource Center </a></li><li data-v-575455fb><a href="/about-journal/article-processing-fees" data-v-575455fb> Article Processing Fees </a></li><li data-v-575455fb><a href="/publishing-policies/section-policies" data-v-575455fb> Publishing Policies </a></li><li data-v-575455fb><a href="/get-involved/new-journal-editor-in-chief-proposals" data-v-575455fb> Get Involved </a></li><li data-v-575455fb><a href="/top-articles/overview" data-v-575455fb> Top Articles </a></li><li data-v-575455fb><a href="/fees/institutional-partners" data-v-575455fb> Institutional Partners </a></li><li data-v-575455fb><a href="/about-journal/indexing-and-impact-factor" data-v-575455fb> Indexing and Impact Factor </a></li></ul></li> <li tabindex="0" aria-haspopup="true" class="journal__link-item journal__link-item--browse" 
data-v-575455fb><span class="journal__link" data-v-575455fb> Browse Journal </span> <span aria-hidden="true" class="icon fas fa-caret-down" data-v-575455fb></span> <ul aria-label="submenu" data-test="browse-journal-popover" class="journal__link-submenu journal__link-submenu--browse" data-v-575455fb><li class="m-0" data-v-575455fb><div class="journal__link-submenu--select" data-v-575455fb><label for="nav-year" data-v-575455fb> Year: </label> <select id="nav-year" data-v-575455fb><option disabled="disabled" value="" data-v-575455fb> Select... </option> <option value="1999" data-v-575455fb> 1999 </option><option value="2000" data-v-575455fb> 2000 </option><option value="2001" data-v-575455fb> 2001 </option><option value="2002" data-v-575455fb> 2002 </option><option value="2003" data-v-575455fb> 2003 </option><option value="2004" data-v-575455fb> 2004 </option><option value="2005" data-v-575455fb> 2005 </option><option value="2006" data-v-575455fb> 2006 </option><option value="2007" data-v-575455fb> 2007 </option><option value="2008" data-v-575455fb> 2008 </option><option value="2009" data-v-575455fb> 2009 </option><option value="2010" data-v-575455fb> 2010 </option><option value="2011" data-v-575455fb> 2011 </option><option value="2012" data-v-575455fb> 2012 </option><option value="2013" data-v-575455fb> 2013 </option><option value="2014" data-v-575455fb> 2014 </option><option value="2015" data-v-575455fb> 2015 </option><option value="2016" data-v-575455fb> 2016 </option><option value="2017" data-v-575455fb> 2017 </option><option value="2018" data-v-575455fb> 2018 </option><option value="2019" data-v-575455fb> 2019 </option><option value="2020" data-v-575455fb> 2020 </option><option value="2021" data-v-575455fb> 2021 </option><option value="2022" data-v-575455fb> 2022 </option><option value="2023" data-v-575455fb> 2023 </option><option value="2024" data-v-575455fb> 2024 </option></select></div></li> <li data-v-575455fb><a href="/announcements" data-v-575455fb> Latest 
Announcements </a></li><li data-v-575455fb><a href="/search/authors" data-v-575455fb> Authors </a></li> <li data-v-575455fb><a href="/themes" data-v-575455fb> Themes </a></li><li data-v-575455fb><a href="/issues" data-v-575455fb> Issues </a></li> <li data-v-575455fb><a href="https://blog.jmir.org/" data-v-575455fb> Blog </a></li></ul></li> <li class="journal__link-item journal__link-item--submit-article" data-v-575455fb><a href="/author" class="btn btn-small btn-blue journal__submit-article" data-v-575455fb>Submit Article</a></li></ul></div></div></div></section> <!----></nav></div></nav></section></header></div> <div class="container-fluid" style="padding: 0px;"><div class="element-wrapper"><div data-test="main-content" class="container"><div class="sidebar-citation mobile-show"><div class="collection"><h2 tabindex="0" data-test="article-collection" class="h4 green-heading-underline width-fit-content"> This paper is in the following <span class="collection__span">e-collection/theme issue:</span></h2> <a href="/themes/797" data-test="article-collection" aria-label="1389 articles belongs to Artificial Intelligence e-collection/theme issue" class="collection__link"> Artificial Intelligence (1389) </a><a href="/themes/500" data-test="article-collection" aria-label="1478 articles belongs to Machine Learning
e-collection/theme issue" class="collection__link"> Machine Learning (1478) </a><a href="/themes/58" data-test="article-collection" aria-label="1033 articles belongs to Clinical Informatics e-collection/theme issue" class="collection__link"> Clinical Informatics (1033) </a><a href="/themes/412" data-test="article-collection" aria-label="195 articles belongs to Imaging Informatics e-collection/theme issue" class="collection__link"> Imaging Informatics (195) </a><a href="/themes/186" data-test="article-collection" aria-label="1194 articles belongs to Decision Support for Health Professionals e-collection/theme issue" class="collection__link"> Decision Support for Health Professionals (1194) </a><a href="/themes/67" data-test="article-collection" aria-label="1415 articles belongs to Clinical Information and Decision Making e-collection/theme issue" class="collection__link"> Clinical Information and Decision Making (1415) </a><a href="/themes/297" data-test="article-collection" aria-label="425 articles belongs to Innovations and Technology in Cancer Care e-collection/theme issue" class="collection__link"> Innovations and Technology in Cancer Care (425) </a></div></div> <div class="row"><div class="main col-lg-9 mb-1"><!----> <div data-test="details" class="details"><div><p id="main-content" tabindex="0"> Published on <time datetime="12.07.2021">12.07.2021 </time> in <span data-test="issue-info"><a href="/2021/7" class="nuxt-link-active"> Vol 23<span>, No 7</span> (2021)<span>: July</span></a></span></p> <!----></div> <div class="preprints-version"><span aria-hidden="true" class="icon fas fa-thumbtack"></span> <div><span class="ml-2"> Preprints (earlier versions) of this paper are available at <a data-test="preprint-link" aria-label="'Preprints (earlier versions) of this paper are available at preprints.jmir.org/preprint/'26151" href="https://preprints.jmir.org/preprint/26151" target="_blank">https://preprints.jmir.org/preprint/26151</a>, first published <time 
datetime="November 30, 2020">November 30, 2020</time>. </span></div></div></div> <div class="info mt-3"><div class="info__article-img"><div data-v-10f10a3e><img data-srcset="https://asset.jmir.pub/assets/391d0ca13f4f8f74602e2dbe63d25e08.png 480w,https://asset.jmir.pub/assets/391d0ca13f4f8f74602e2dbe63d25e08.png 960w,https://asset.jmir.pub/assets/391d0ca13f4f8f74602e2dbe63d25e08.png 1920w,https://asset.jmir.pub/assets/391d0ca13f4f8f74602e2dbe63d25e08.png 2500w" alt="Clinically Applicable Segmentation of Head and Neck Anatomy for Radiotherapy: Deep Learning Algorithm Development and Validation Study" title="Clinically Applicable Segmentation of Head and Neck Anatomy for Radiotherapy: Deep Learning Algorithm Development and Validation Study" aria-label="Article Thumbnail Image" src="https://asset.jmir.pub/placeholder.svg" data-v-10f10a3e></div> <div data-test="article-img-info" class="info__article-img-info"><span aria-hidden="true" class="icon fas fa-search-plus"></span></div></div> <div class="info__title-authors"><h1 tabindex="0" aria-label="Clinically Applicable Segmentation of Head and Neck Anatomy for Radiotherapy: Deep Learning Algorithm Development and Validation Study" class="h3 mb-0 mt-0">Clinically Applicable Segmentation of Head and Neck Anatomy for Radiotherapy: Deep Learning Algorithm Development and Validation Study</h1> <h2 class="info__hidden-title"> Clinically Applicable Segmentation of Head and Neck Anatomy for Radiotherapy: Deep Learning Algorithm Development and Validation Study </h2> <div class="mt-3"><p tabindex="0" class="authors-for-screen-reader"> Authors of this article: </p> <span data-test="authors-info" class="info__authors"><a href="/search?term=Stanislav%20Nikolov&amp;type=author&amp;precise=true" aria-label="Stanislav Nikolov. 
Search more articles by this author."> Stanislav Nikolov<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0001-8234-0751"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Sam%20Blackwell&amp;type=author&amp;precise=true" aria-label="Sam Blackwell. Search more articles by this author."> Sam Blackwell<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0001-8730-3036"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Alexei%20Zverovitch&amp;type=author&amp;precise=true" aria-label="Alexei Zverovitch. Search more articles by this author."> Alexei Zverovitch<sup>2</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-0567-5440"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Ruheena%20Mendes&amp;type=author&amp;precise=true" aria-label="Ruheena Mendes. 
Search more articles by this author."> Ruheena Mendes<sup>3</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0003-4754-1181"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Michelle%20Livne&amp;type=author&amp;precise=true" aria-label="Michelle Livne. Search more articles by this author."> Michelle Livne<sup>2</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-8277-4733"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Jeffrey%20De%20Fauw&amp;type=author&amp;precise=true" aria-label="Jeffrey De Fauw. Search more articles by this author."> Jeffrey De Fauw<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0001-6971-5678"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Yojan%20Patel&amp;type=author&amp;precise=true" aria-label="Yojan Patel. 
Search more articles by this author."> Yojan Patel<sup>2</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0001-6397-6279"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Clemens%20Meyer&amp;type=author&amp;precise=true" aria-label="Clemens Meyer. Search more articles by this author."> Clemens Meyer<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0003-1165-6104"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Harry%20Askham&amp;type=author&amp;precise=true" aria-label="Harry Askham. Search more articles by this author."> Harry Askham<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0003-1530-4683"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Bernadino%20Romera-Paredes&amp;type=author&amp;precise=true" aria-label="Bernadino Romera-Paredes. 
Search more articles by this author."> Bernadino Romera-Paredes<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0003-3604-3590"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Christopher%20Kelly&amp;type=author&amp;precise=true" aria-label="Christopher Kelly. Search more articles by this author."> Christopher Kelly<sup>2</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-1246-844X"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Alan%20Karthikesalingam&amp;type=author&amp;precise=true" aria-label="Alan Karthikesalingam. Search more articles by this author."> Alan Karthikesalingam<sup>2</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0001-5074-898X"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Carlton%20Chu&amp;type=author&amp;precise=true" aria-label="Carlton Chu. 
Search more articles by this author."> Carlton Chu<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0001-8282-6364"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Dawn%20Carnell&amp;type=author&amp;precise=true" aria-label="Dawn Carnell. Search more articles by this author."> Dawn Carnell<sup>3</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-2898-3219"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Cheng%20Boon&amp;type=author&amp;precise=true" aria-label="Cheng Boon. Search more articles by this author."> Cheng Boon<sup>4</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0003-2652-9263"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Derek%20D%27Souza&amp;type=author&amp;precise=true" aria-label="Derek D'Souza. 
Search more articles by this author."> Derek D'Souza<sup>3</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-4393-7683"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Syed%20Ali%20Moinuddin&amp;type=author&amp;precise=true" aria-label="Syed Ali Moinuddin. Search more articles by this author."> Syed Ali Moinuddin<sup>3</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-8955-8224"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Bethany%20Garie&amp;type=author&amp;precise=true" aria-label="Bethany Garie. Search more articles by this author."> Bethany Garie<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0003-3538-9063"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Yasmin%20McQuinlan&amp;type=author&amp;precise=true" aria-label="Yasmin McQuinlan. 
Search more articles by this author."> Yasmin McQuinlan<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-8464-0640"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Sarah%20Ireland&amp;type=author&amp;precise=true" aria-label="Sarah Ireland. Search more articles by this author."> Sarah Ireland<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-2975-2447"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Kiarna%20Hampton&amp;type=author&amp;precise=true" aria-label="Kiarna Hampton. Search more articles by this author."> Kiarna Hampton<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-4384-6108"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Krystle%20Fuller&amp;type=author&amp;precise=true" aria-label="Krystle Fuller. 
Search more articles by this author."> Krystle Fuller<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0003-0706-6857"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Hugh%20Montgomery&amp;type=author&amp;precise=true" aria-label="Hugh Montgomery. Search more articles by this author."> Hugh Montgomery<sup>5</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0001-8797-5019"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Geraint%20Rees&amp;type=author&amp;precise=true" aria-label="Geraint Rees. Search more articles by this author."> Geraint Rees<sup>5</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-9623-7007"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Mustafa%20Suleyman&amp;type=author&amp;precise=true" aria-label="Mustafa Suleyman. 
Search more articles by this author."> Mustafa Suleyman<sup>6</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-5415-4457"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Trevor%20Back&amp;type=author&amp;precise=true" aria-label="Trevor Back. Search more articles by this author."> Trevor Back<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-0567-8043"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=C%C3%ADan%20Owen%20Hughes&amp;type=author&amp;precise=true" aria-label="Cían Owen Hughes. Search more articles by this author."> Cían Owen Hughes<sup>2</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0001-6901-0985"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Joseph%20R%20Ledsam&amp;type=author&amp;precise=true" aria-label="Joseph R Ledsam. 
Search more articles by this author."> Joseph R Ledsam<sup>7</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0001-9917-7196"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <span style="margin-left: -2px;"> ;   </span></span><span data-test="authors-info" class="info__authors"><a href="/search?term=Olaf%20Ronneberger&amp;type=author&amp;precise=true" aria-label="Olaf Ronneberger. Search more articles by this author."> Olaf Ronneberger<sup>1</sup> <!----></a> <span><a aria-label="Visit this author on ORCID website" data-test="orcid-link" target="_blank" href="https://orcid.org/0000-0002-4266-1515"><img src="https://asset.jmir.pub/assets/static/images/Orcid-ID-Logo-Colour.png" alt="Author Orcid Image" aria-label="Author Orcid Image" class="info__orcid-img"></a></span> <!----></span></div> <!----></div></div> <div role="tablist" aria-label="Article" class="tabs"><a href="/2021/7/e26151/" aria-current="page" role="tab" aria-label="Article" data-test="tabs" class="nuxt-link-exact-active nuxt-link-active active"> Article </a><a href="/2021/7/e26151/authors" role="tab" aria-label="Authors" data-test="tabs"> Authors </a><a href="/2021/7/e26151/citations" role="tab" aria-label="Cited by (147)" data-test="tabs"> Cited by (147) </a><a href="/2021/7/e26151/tweetations" role="tab" aria-label="Tweetations (12)" data-test="tabs"> Tweetations (12) </a><a href="/2021/7/e26151/metrics" role="tab" aria-label="Metrics" data-test="tabs"> Metrics </a></div> <div class="container"><div class="row"><div class="col-lg-3 mb-5 sidebar-sections"><div class="sidebar-nav"><div class="sidebar-nav-sticky"><ul></ul></div></div></div> <div data-test="keyword-links" class="col-lg-9 article"><main id="wrapper" class="wrapper ArticleMain clearfix"><section class="inner-wrapper 
clearfix"><section class="main-article-content clearfix"><article class="ajax-article-content"><h4 class="h4-original-paper"><span class="typcn typcn-document-text"/>Original Paper</h4><div class="authors-container"><div class="authors clearfix"/></div><div class="authors-container"><div class="authors clearfix"/></div><div class="authors-container"><div class="authors clearfix"><ul class="clearfix"><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Stanislav+Nikolov" class="btn-view-author-options">Stanislav Nikolov<sup><small>1</small></sup><sup>*</sup>, MEng</a><a class="author-orcid" href="https://orcid.org/0000-0001-8234-0751" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Sam+Blackwell" class="btn-view-author-options">Sam Blackwell<sup><small>1</small></sup><sup>*</sup>, MEng</a><a class="author-orcid" href="https://orcid.org/0000-0001-8730-3036" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Alexei+Zverovitch" class="btn-view-author-options">Alexei Zverovitch<sup><small>2</small></sup><sup>*</sup>, PhD</a><a class="author-orcid" href="https://orcid.org/0000-0002-0567-5440" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Ruheena+Mendes" class="btn-view-author-options">Ruheena Mendes<sup><small>3</small></sup>, MB ChB</a><a class="author-orcid" href="https://orcid.org/0000-0003-4754-1181" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Michelle+Livne" class="btn-view-author-options">Michelle Livne<sup><small>2</small></sup>, PhD</a><a class="author-orcid" href="https://orcid.org/0000-0002-8277-4733" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Jeffrey+De Fauw" 
class="btn-view-author-options">Jeffrey De Fauw<sup><small>1</small></sup>, BSc</a><a class="author-orcid" href="https://orcid.org/0000-0001-6971-5678" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Yojan+Patel" class="btn-view-author-options">Yojan Patel<sup><small>2</small></sup>, BA</a><a class="author-orcid" href="https://orcid.org/0000-0001-6397-6279" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Clemens+Meyer" class="btn-view-author-options">Clemens Meyer<sup><small>1</small></sup>, MSc</a><a class="author-orcid" href="https://orcid.org/0000-0003-1165-6104" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Harry+Askham" class="btn-view-author-options">Harry Askham<sup><small>1</small></sup>, MSc</a><a class="author-orcid" href="https://orcid.org/0000-0003-1530-4683" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Bernadino+Romera-Paredes" class="btn-view-author-options">Bernadino Romera-Paredes<sup><small>1</small></sup>, PhD</a><a class="author-orcid" href="https://orcid.org/0000-0003-3604-3590" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Christopher+Kelly" class="btn-view-author-options">Christopher Kelly<sup><small>2</small></sup>, PhD</a><a class="author-orcid" href="https://orcid.org/0000-0002-1246-844X" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Alan+Karthikesalingam" class="btn-view-author-options">Alan Karthikesalingam<sup><small>2</small></sup>, PhD</a><a class="author-orcid" href="https://orcid.org/0000-0001-5074-898X" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a 
href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Carlton+Chu" class="btn-view-author-options">Carlton Chu<sup><small>1</small></sup>, PhD</a><a class="author-orcid" href="https://orcid.org/0000-0001-8282-6364" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Dawn+Carnell" class="btn-view-author-options">Dawn Carnell<sup><small>3</small></sup>, MD</a><a class="author-orcid" href="https://orcid.org/0000-0002-2898-3219" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Cheng+Boon" class="btn-view-author-options">Cheng Boon<sup><small>4</small></sup>, MB ChB</a><a class="author-orcid" href="https://orcid.org/0000-0003-2652-9263" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Derek+D'Souza" class="btn-view-author-options">Derek D'Souza<sup><small>3</small></sup>, MSc</a><a class="author-orcid" href="https://orcid.org/0000-0002-4393-7683" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Syed Ali+Moinuddin" class="btn-view-author-options">Syed Ali Moinuddin<sup><small>3</small></sup>, MSc</a><a class="author-orcid" href="https://orcid.org/0000-0002-8955-8224" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Bethany+Garie" class="btn-view-author-options">Bethany Garie<sup><small>1</small></sup>, BMRSc (RT)</a><a class="author-orcid" href="https://orcid.org/0000-0003-3538-9063" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Yasmin+McQuinlan" class="btn-view-author-options">Yasmin McQuinlan<sup><small>1</small></sup>, BRT</a><a class="author-orcid" href="https://orcid.org/0000-0002-8464-0640" target="_blank" 
title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Sarah+Ireland" class="btn-view-author-options">Sarah Ireland<sup><small>1</small></sup>, BMRSc (RT)</a><a class="author-orcid" href="https://orcid.org/0000-0002-2975-2447" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Kiarna+Hampton" class="btn-view-author-options">Kiarna Hampton<sup><small>1</small></sup>, MPH</a><a class="author-orcid" href="https://orcid.org/0000-0002-4384-6108" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Krystle+Fuller" class="btn-view-author-options">Krystle Fuller<sup><small>1</small></sup>, BAppSc (RT)</a><a class="author-orcid" href="https://orcid.org/0000-0003-0706-6857" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Hugh+Montgomery" class="btn-view-author-options">Hugh Montgomery<sup><small>5</small></sup>, BSc, MB BS, MD</a><a class="author-orcid" href="https://orcid.org/0000-0001-8797-5019" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Geraint+Rees" class="btn-view-author-options">Geraint Rees<sup><small>5</small></sup>, PhD</a><a class="author-orcid" href="https://orcid.org/0000-0002-9623-7007" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Mustafa+Suleyman" class="btn-view-author-options">Mustafa Suleyman<sup><small>6</small></sup></a><a class="author-orcid" href="https://orcid.org/0000-0002-5415-4457" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Trevor+Back" class="btn-view-author-options">Trevor Back<sup><small>1</small></sup>, PhD</a><a class="author-orcid" 
href="https://orcid.org/0000-0002-0567-8043" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=C&#237;an Owen+Hughes" class="btn-view-author-options">C&#237;an Owen Hughes<sup><small>2</small></sup><sup>*</sup>, MBChB, MRCS, MSc</a><a class="author-orcid" href="https://orcid.org/0000-0001-6901-0985" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Joseph R+Ledsam" class="btn-view-author-options">Joseph R Ledsam<sup><small>7</small></sup><sup>*</sup>, MB ChB</a><a class="author-orcid" href="https://orcid.org/0000-0001-9917-7196" target="_blank" title="ORCID">&#160;</a>;&#160;</li><li><a href="/search/searchResult?field%5B%5D=author&amp;criteria%5B%5D=Olaf+Ronneberger" class="btn-view-author-options">Olaf Ronneberger<sup><small>1</small></sup><sup>*</sup>, PhD</a><a class="author-orcid" href="https://orcid.org/0000-0002-4266-1515" target="_blank" title="ORCID">&#160;</a></li></ul><div class="author-affiliation-details"><p><sup>1</sup>DeepMind, London, United Kingdom</p><p><sup>2</sup>Google Health, London, United Kingdom</p><p><sup>3</sup>University College London Hospitals NHS Foundation Trust, London, United Kingdom</p><p><sup>4</sup>Clatterbridge Cancer Centre NHS Foundation Trust, Liverpool, United Kingdom</p><p><sup>5</sup>University College London, London, United Kingdom</p><p><sup>6</sup>Google, London, United Kingdom</p><p><sup>7</sup>Google AI, Tokyo, Japan</p><p>*these authors contributed equally</p></div></div><div class="corresponding-author-and-affiliations clearfix"><div class="corresponding-author-details"><h3>Corresponding Author:</h3><p>C&#237;an Owen Hughes, MBChB, MRCS, MSc</p><p/><p>Google Health</p><p>6 Pancras Square</p><p>London, N1C 4AG</p><p>United Kingdom</p><p>Phone: 1 650 253 0000</p><p>Fax:1 650 253 0001</p><p>Email: <a 
href="mailto:cianh@google.com">cianh@google.com</a></p><br/></div></div></div><section class="article-content clearfix"><article class="abstract"><h3 id="Abstract" class="navigation-heading" data-label="Abstract">Abstract</h3><p><span class="abstract-sub-heading">Background: </span>Over half a million individuals are diagnosed with head and neck cancer each year globally. Radiotherapy is an important curative treatment for this disease, but it requires the time-consuming manual delineation of radiosensitive organs at risk. This planning process can delay treatment while also introducing interoperator variability, resulting in downstream radiation dose differences. Although auto-segmentation algorithms offer a potentially time-saving solution, the challenges in defining, quantifying, and achieving expert performance remain.<br/></p><p><span class="abstract-sub-heading">Objective: </span>Adopting a deep learning approach, we aim to demonstrate a 3D U-Net architecture that achieves expert-level performance in delineating 21 distinct head and neck organs at risk commonly segmented in clinical practice.<br/></p><p><span class="abstract-sub-heading">Methods: </span>The model was trained on a data set of 663 deidentified computed tomography scans acquired in routine clinical practice, with segmentations both taken from clinical practice and created by experienced radiographers as part of this research, all in accordance with consensus organ at risk definitions.<br/></p><p><span class="abstract-sub-heading">Results: </span>We demonstrated the model&#8217;s clinical applicability by assessing its performance on a test set of 21 computed tomography scans from clinical practice, each with 21 organs at risk segmented by 2 independent experts.
We also introduced the surface Dice similarity coefficient, a new metric for the comparison of organ delineation, to quantify the deviation between organ at risk surface contours rather than volumes, better reflecting the clinical task of correcting errors in automated organ segmentations. The model&#8217;s generalizability was then demonstrated on 2 distinct open-source data sets, representing centers and countries different from those used in model training.<br/></p><p><span class="abstract-sub-heading">Conclusions: </span>Deep learning is an effective and clinically applicable technique for the segmentation of the head and neck anatomy for radiotherapy. With appropriate validation studies and regulatory approvals, this system could improve the efficiency, consistency, and safety of radiotherapy pathways.<br/></p><strong class="h4-article-volume-issue">J Med Internet Res 2021;23(7):e26151</strong><br/><br/><span class="article-doi"><a href="https://doi.org/10.2196/26151">doi:10.2196/26151</a></span><br/><br/><h3 class="h3-main-heading" id="Keywords">Keywords</h3><div class="keywords"><span><a href="/search?type=keyword&amp;term=radiotherapy">radiotherapy</a>;&#160;</span><span><a href="/search?type=keyword&amp;term=segmentation">segmentation</a>;&#160;</span><span><a href="/search?type=keyword&amp;term=contouring">contouring</a>;&#160;</span><span><a href="/search?type=keyword&amp;term=machine learning">machine learning</a>;&#160;</span><span><a href="/search?type=keyword&amp;term=artificial intelligence">artificial intelligence</a>;&#160;</span><span><a href="/search?type=keyword&amp;term=UNet">UNet</a>;&#160;</span><span><a href="/search?type=keyword&amp;term=convolutional neural networks">convolutional neural networks</a>;&#160;</span><span><a href="/search?type=keyword&amp;term=surface DSC">surface DSC</a>&#160;</span></div><div id="trendmd-suggestions"/></article><br/><article class="main-article clearfix"><br/><h3 class="navigation-heading h3-main-heading" id="Introduction"
data-label="Introduction">Introduction</h3><h4>Background</h4><p class="abstract-paragraph">Each year, 550,000 people worldwide are diagnosed with cancer of the head and neck [<span class="footers"><a class="citation-link" href="#ref1" rel="footnote">1</a></span>]. This incidence is rising [<span class="footers"><a class="citation-link" href="#ref2" rel="footnote">2</a></span>] and has more than doubled in certain subgroups over the last 30 years [<span class="footers"><a class="citation-link" href="#ref3" rel="footnote">3</a></span>-<span class="footers"><a class="citation-link" href="#ref5" rel="footnote">5</a></span>]. Where available, most patients will be treated with radiotherapy, which targets the tumor mass and areas at high risk of microscopic tumor spread. However, strategies are needed to mitigate the dose-dependent adverse effects that result from incidental irradiation of normal anatomical structures (<i>organs at risk</i>) [<span class="footers"><a class="citation-link" href="#ref6" rel="footnote">6</a></span>-<span class="footers"><a class="citation-link" href="#ref9" rel="footnote">9</a></span>].</p><p class="abstract-paragraph">Thus, the efficacy and safety of head and neck radiotherapy depend on the accurate delineation of organs at risk and tumors, a process known as segmentation or contouring. However, because this process is predominantly manual, results may be both inconsistent and imperfectly accurate [<span class="footers"><a class="citation-link" href="#ref10" rel="footnote">10</a></span>], leading to large inter- and intrapractitioner variability even among experts and thus variation in care quality [<span class="footers"><a class="citation-link" href="#ref11" rel="footnote">11</a></span>].</p><p class="abstract-paragraph">Segmentation is also very time consuming: an expert can spend 4 hours or more on a single case [<span class="footers"><a class="citation-link" href="#ref12" rel="footnote">12</a></span>].
The duration of resulting delays in treatment initiation (<span class="footers"><a class="citation-link" href="#figure1" rel="footnote">Figure 1</a></span>) is associated with an increased risk of both local recurrence and overall mortality [<span class="footers"><a class="citation-link" href="#ref13" rel="footnote">13</a></span>-<span class="footers"><a class="citation-link" href="#ref15" rel="footnote">15</a></span>]. Increasing demands for, and shortages of, trained staff already place a heavy burden on health care systems, which can lead to long delays for patients as radiotherapy is planned [<span class="footers"><a class="citation-link" href="#ref16" rel="footnote">16</a></span>,<span class="footers"><a class="citation-link" href="#ref17" rel="footnote">17</a></span>], and the continued rise in head and neck cancer incidence may make it impossible to maintain even current temporal reporting standards [<span class="footers"><a class="citation-link" href="#ref4" rel="footnote">4</a></span>]. Such issues also represent a barrier to <i>adaptive radiotherapy</i>&#8212;the process of repeated scanning, segmentation, and radiotherapy planning throughout treatment, which maintains the precision of tumor targeting (and organ at risk avoidance) in the face of treatment-related anatomic changes such as tumor shrinkage [<span class="footers"><a class="citation-link" href="#ref18" rel="footnote">18</a></span>].</p><figure><a name="figure1">&#8206;</a><a class="fancybox" title="Figure 1. A typical clinical pathway for radiotherapy. After a patient is diagnosed and the decision is made to treat with radiotherapy, a defined workflow aims to provide treatment that is both safe and effective. In the United Kingdom, the time delay between decision to treat and treatment delivery should be no greater than 31 days. Time-intensive manual segmentation and dose optimization steps can introduce delays to treatment." 
href="https://asset.jmir.pub/assets/c3a17c2b83daa977a6e80bb77dcfc12f.png" id="figure1"><img class="figure-image" src="https://asset.jmir.pub/assets/c3a17c2b83daa977a6e80bb77dcfc12f.png"/></a><figcaption><span class="typcn typcn-image"/>Figure 1. A typical clinical pathway for radiotherapy. After a patient is diagnosed and the decision is made to treat with radiotherapy, a defined workflow aims to provide treatment that is both safe and effective. In the United Kingdom, the time delay between decision to treat and treatment delivery should be no greater than 31 days. Time-intensive manual segmentation and dose optimization steps can introduce delays to treatment. </figcaption><a class="fancybox" href="https://asset.jmir.pub/assets/c3a17c2b83daa977a6e80bb77dcfc12f.png" title="Figure 1. A typical clinical pathway for radiotherapy. After a patient is diagnosed and the decision is made to treat with radiotherapy, a defined workflow aims to provide treatment that is both safe and effective. In the United Kingdom, the time delay between decision to treat and treatment delivery should be no greater than 31 days. Time-intensive manual segmentation and dose optimization steps can introduce delays to treatment.">View this figure</a></figure><p class="abstract-paragraph">Automated (ie, computer-performed) segmentation has the potential to address these challenges. However, most segmentation algorithms in clinical use are atlas based, producing segmentations by fitting previously labeled reference images to the new target scan. This might not sufficiently account for either postsurgical changes or the variability in normal anatomical structures that exist between patients, particularly when considering the variable effect that tumors may have on local anatomy; thus, they may be prone to systematic error. 
To date, such algorithm-derived segmentations still require significant manual editing, perform at expert levels on only a small number of organs, demonstrate an overall performance in clinical practice inferior to that of human experts, and have failed to significantly improve clinical workflows [<span class="footers"><a class="citation-link" href="#ref19" rel="footnote">19</a></span>-<span class="footers"><a class="citation-link" href="#ref26" rel="footnote">26</a></span>].</p><p class="abstract-paragraph">In recent years, deep learning&#8211;based algorithms have proven capable of delivering substantially better performance than traditional segmentation algorithms. Several deep learning&#8211;based approaches have been proposed for head and neck cancer segmentation. Some of them use standard convolutional neural network classifiers on patches with tailored pre- and postprocessing [<span class="footers"><a class="citation-link" href="#ref27" rel="footnote">27</a></span>-<span class="footers"><a class="citation-link" href="#ref31" rel="footnote">31</a></span>]. However, the U-Net convolutional architecture [<span class="footers"><a class="citation-link" href="#ref32" rel="footnote">32</a></span>] has shown promise in the area of deep learning&#8211;based medical image segmentation [<span class="footers"><a class="citation-link" href="#ref33" rel="footnote">33</a></span>] and has also been applied to head and neck radiotherapy segmentation [<span class="footers"><a class="citation-link" href="#ref34" rel="footnote">34</a></span>-<span class="footers"><a class="citation-link" href="#ref47" rel="footnote">47</a></span>].</p><p class="abstract-paragraph">Despite the promise that deep learning offers, barriers remain in the application of auto-segmentation in radiotherapy planning. 
These include the absence of consensus on how <i>expert</i> performance is defined, the lack of available methods by which such human performance can be compared with that delivered by automated segmentation processes, and thus how the clinical acceptability of automated processes can be defined.</p><h4>Objectives</h4><p class="abstract-paragraph">In this paper, we address these challenges in defining comparison metrics and report a deep learning approach that delineates a wide range of important organs at risk in head and neck cancer radiotherapy scans. We aim to achieve this using a study design that includes (1) the introduction of a clinically meaningful performance metric for segmentation in radiotherapy planning, (2) a representative set of images acquired during routine clinical practice, (3) an unambiguous segmentation protocol for all organs, and (4) a segmentation of each test set image according to these protocols by 2 independent experts. By achieving performance equal to that of human experts on previously unseen patients from the same hospital site used for training, we aim to demonstrate the clinical applicability of our approach, in addition to the model&#8217;s generalizability, which is demonstrated on two distinct open-source data sets.</p><br/><h3 class="navigation-heading h3-main-heading" id="Methods" data-label="Methods">Methods</h3><h4>Data Sets</h4><p class="abstract-paragraph">University College London Hospitals (UCLH) National Health Service (NHS) Foundation Trust serves an urban, mixed socioeconomic and ethnic population in central London, United Kingdom, and houses a specialist center for cancer treatment. Data were selected from a retrospective cohort of all-adult (aged &gt;18 years) UCLH patients who underwent computed tomography (CT) scans to plan radical radiotherapy treatment for head and neck cancer between January 1, 2008, and March 20, 2016. Both initial CT images and rescans were included in the training data set.
Patients with all tumor types, stages, and histological grades were considered for inclusion, as long as their CT scans were available in digital form and were of sufficient diagnostic quality. The standard CT pixel spacing was 0.976&#215;0.976&#215;2.5 mm, and scans with nonstandard spacing (with the exception of 1.25-mm spacing scans that were subsampled) were excluded to ensure consistent performance metrics during training. It should be noted that for The Cancer Imaging Archive (TCIA) test set, the in-plane pixel spacing was not used as an exclusion criterion; it ranged from 0.94 to 1.27 mm. For the Public Domain Database for Computational Anatomy (PDDCA) test set, we included all scans, and the voxels varied from 2 to 3 mm in height and 0.98 to 1.27 mm in axial dimension. Patients&#8217; requests not to have their data shared for research were respected.</p><p class="abstract-paragraph">Of the 513 patients who underwent radiotherapy at UCLH within the given study dates, a total of 486 patients (94.7%; 838 scans; mean age 57 years; male 337, female 146, and gender unknown 3) met the inclusion criteria. Of note, no scans were excluded because of poor diagnostic quality. Scans from UCLH were split into a training set (389 patients; 663 scans), validation set (51 patients; 100 scans), and test set (46 patients; 75 scans). From the selected test set, 19 patients (21 scans) underwent the adjudicated contouring described below. No patient was included in multiple data sets; in cases where multiple scans were present for a single patient, all were included in the same subset. Multiple scans present for a single patient reflect CT scans taken for the purpose of replanning radiotherapy owing to anatomical changes during the course of treatment. It is important for models to perform well in both scenarios as treatment-naive and postradiotherapy organ at risk anatomies can differ.
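The patient-level split described above (every scan from a given patient assigned to the same subset, so that replanning rescans cannot leak between training and test data) can be sketched as follows. This is an illustrative sketch only; the function name, field names, and split fractions are assumptions for the example, not taken from the authors' pipeline.

```python
import random
from collections import defaultdict

def split_by_patient(scans, fractions=(0.8, 0.1, 0.1), seed=0):
    """Split scans into train/validation/test subsets at the patient level,
    so that all scans from one patient (eg, replanning rescans) land in the
    same subset and cannot leak across the split."""
    by_patient = defaultdict(list)
    for scan in scans:
        by_patient[scan["patient_id"]].append(scan)

    patients = sorted(by_patient)            # deterministic order
    random.Random(seed).shuffle(patients)    # reproducible shuffle

    n_train = int(fractions[0] * len(patients))
    n_val = int(fractions[1] * len(patients))
    groups = (patients[:n_train],
              patients[n_train:n_train + n_val],
              patients[n_train + n_val:])
    # Flatten each patient group back into a list of scans.
    return tuple([s for p in g for s in by_patient[p]] for g in groups)
```

Splitting by patient rather than by scan is what makes the reported test metrics valid when some patients contribute multiple scans.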
However, care was taken in the TCIA test set (described later in this section) to avoid potential correlation between the same organs segmented twice in the same data set.</p><p class="abstract-paragraph">In total, 21 organs at risk were selected throughout the head and neck area to represent a wide range of anatomical regions. We used a combination of segmentations sourced from those used clinically at UCLH and additional segmentations performed in-house by trained radiographers.</p><p class="abstract-paragraph">We divided our UCLH data set into the following categories: (1) <i>training set</i>: used to train the model; a combination of UCLH clinical segmentations and in-house segmentations, some of which were only 2D slices (owing to the time required to segment larger organs manually, we initially relied heavily on sparse segmentations to make efficient use of the radiographers&#8217; time). (2) <i>UCLH validation set</i>: used to evaluate model performance and steer additional data set priorities; this used in-house segmentations only, as we did not want to overfit to any clinical bias. (3) <i>UCLH test set</i>: our primary result set; each scan has every organ at risk labeled and was independently segmented from scratch by 2 radiographers before one of each pair of segmentations (chosen arbitrarily) was reviewed and corrected by an experienced radiation oncologist.</p><p class="abstract-paragraph">As these scans were taken from UCLH patients not present elsewhere, and to assess generalizability, we curated additional open-source CT scans available from The Cancer Genome Atlas Head-Neck Squamous Cell Carcinoma (TCGA-HNSC) and Head-Neck Cetuximab [<span class="footers"><a class="citation-link" href="#ref48" rel="footnote">48</a></span>-<span class="footers"><a class="citation-link" href="#ref50" rel="footnote">50</a></span>].
The open-source (category 4) TCIA validation set and (category 5) TCIA test set were both labeled in the same way as our UCLH test set.</p><p class="abstract-paragraph">Non-CT planning scans and those that did not meet the same slice thickness as the UCLH scans (2.5 mm) were excluded. These were then manually segmented in-house according to the Brouwer Atlas [<span class="footers"><a class="citation-link" href="#ref51" rel="footnote">51</a></span>] (the segmentation procedure is described in further detail in the <i>Clinical Labeling and Annotation</i> section). We included 31 scans (22 Head-Neck Cetuximab and 9 TCGA-HNSC) that met these criteria, which we further split into validation (6 patients; 7 scans) and test (24 patients; 24 scans) sets (<span class="footers"><a class="citation-link" href="#figure2" rel="footnote">Figure 2</a></span>). The original segmentations from the Head-Neck Cetuximab data set were not included; a consensus assessment by experienced radiographers and oncologists found the segmentations either nonconformant to the selected segmentation protocol or below the quality that would be acceptable for clinical care. The original inclusion criteria for Head-Neck Cetuximab were patients with stage 3-4 carcinoma of the oropharynx, larynx, and hypopharynx, with a Zubrod performance status of 0-1, and meeting predefined blood chemistry criteria between November 2005 and March 2009. The TCGA-HNSC data set included patients treated for head-neck squamous cell carcinoma, with no further restrictions apparent [<span class="footers"><a class="citation-link" href="#ref48" rel="footnote">48</a></span>,<span class="footers"><a class="citation-link" href="#ref50" rel="footnote">50</a></span>].</p><figure><a name="figure2">&#8206;</a><a class="fancybox" title="Figure 2. Case selection from the University College London Hospitals and The Cancer Imaging Archive computed tomography data sets. 
A consort-style diagram demonstrating the application of inclusion and exclusion criteria to select the training, validation, and test sets used in this work. CT: computed tomography; HN_C: Head and Neck Carcinoma; N/A: not applicable; TCIA: The Cancer Imaging Archive; TCGA: The Cancer Genome Atlas; UCLH: University College London Hospitals; Val: validation." href="https://asset.jmir.pub/assets/9c1afe91ef715718f5005d99d08c139a.png" id="figure2"><img class="figure-image" src="https://asset.jmir.pub/assets/9c1afe91ef715718f5005d99d08c139a.png"/></a><figcaption><span class="typcn typcn-image"/>Figure 2. Case selection from the University College London Hospitals and The Cancer Imaging Archive computed tomography data sets. A consort-style diagram demonstrating the application of inclusion and exclusion criteria to select the training, validation, and test sets used in this work. CT: computed tomography; HN_C: Head and Neck Carcinoma; N/A: not applicable; TCIA: The Cancer Imaging Archive; TCGA: The Cancer Genome Atlas; UCLH: University College London Hospitals; Val: validation. </figcaption></figure><p class="abstract-paragraph">All test sets were kept separate during model training and validation. 
<span class="footers"><a class="citation-link" href="#table1" rel="footnote">Table 1</a></span> describes in detail the demographics and characteristics of the data sets; to obtain balanced demographics in each of the test, validation, and training data sets, we sampled randomly stratified splits and selected the one that minimized the differences in key demographics between the data sets.</p><p class="abstract-paragraph">In addition, the (6) <i>PDDCA open-source data set</i> consisted of 15 patients selected from the Head-Neck Cetuximab open-source data set [<span class="footers"><a class="citation-link" href="#ref48" rel="footnote">48</a></span>]. Owing to differences in selection criteria and in test, validation, or training set allocation, 5 scans were present in both the TCIA and PDDCA test sets. This data set was used without further postprocessing and was accessed only once, to assess volumetric Dice similarity coefficient (DSC) performance. The PDDCA test set differs from the TCIA test set in both the segmentation protocol and the axial slice thickness. The work by Raudaschl et al [<span class="footers"><a class="citation-link" href="#ref25" rel="footnote">25</a></span>] provides more details on the data set characteristics and preprocessing.</p><div class="figure-table"><figcaption><span class="typcn typcn-clipboard"/>Table 1. 
Data set characteristics<sup>a</sup>.</figcaption><table width="1000" cellpadding="5" cellspacing="0" border="1" rules="groups" frame="hsides"><col width="30" span="1"></col><col width="290" span="1"></col><col width="0" span="1"></col><col width="120" span="1"></col><col width="0" span="1"></col><col width="130" span="1"></col><col width="0" span="1"></col><col width="100" span="1"></col><col width="0" span="1"></col><col width="0" span="1"></col><col width="130" span="1"></col><col width="0" span="1"></col><col width="100" span="1"></col><col width="0" span="1"></col><col width="0" span="1"></col><col width="100" span="1"></col><thead><tr valign="top"><td colspan="3" rowspan="1">Data set</td><td colspan="7" rowspan="1">UCLH<sup>b</sup></td><td colspan="5" rowspan="1">TCIA<sup>c</sup></td><td rowspan="1" colspan="1">PDDCA<sup>d</sup></td></tr><tr valign="top"><td colspan="3" rowspan="1"><br/></td><td colspan="2" rowspan="1">Train</td><td colspan="2" rowspan="1">Validation</td><td colspan="2" rowspan="1">Test</td><td colspan="3" rowspan="1">Validation</td><td colspan="2" rowspan="1">Test</td><td colspan="2" rowspan="1">Test</td></tr></thead><tbody><tr valign="top"><td colspan="3" rowspan="1">Total scans (patients), n</td><td colspan="2" rowspan="1">663 (389)</td><td colspan="2" rowspan="1">100 (51)</td><td colspan="2" rowspan="1">21 (19)</td><td colspan="3" rowspan="1">7 (6)</td><td colspan="2" rowspan="1">24 (24)</td><td colspan="2" rowspan="1">15 (15)</td></tr><tr valign="top"><td colspan="3" rowspan="1">Average patient age (years)</td><td colspan="2" rowspan="1">57.1</td><td colspan="2" rowspan="1">57.5</td><td colspan="2" rowspan="1">59.6</td><td colspan="3" rowspan="1">56.5</td><td colspan="2" rowspan="1">59.9</td><td colspan="2" rowspan="1">58.6</td></tr><tr valign="top"><td colspan="16" rowspan="1"><b>Sex, number of scans (number of patients)</b></td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Female</td><td 
colspan="2" rowspan="1">207 (115)</td><td colspan="2" rowspan="1">36 (19)</td><td colspan="2" rowspan="1">7 (6)</td><td colspan="3" rowspan="1">2 (2)</td><td colspan="2" rowspan="1">2 (2)</td><td colspan="3" rowspan="1">2 (2)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Male</td><td colspan="2" rowspan="1">450 (271)</td><td colspan="2" rowspan="1">64 (32)</td><td colspan="2" rowspan="1">14 (13)</td><td colspan="3" rowspan="1">5 (4)</td><td colspan="2" rowspan="1">20 (20)</td><td colspan="3" rowspan="1">9 (9)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Unknown</td><td colspan="2" rowspan="1">6 (3)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">2 (2)</td><td colspan="3" rowspan="1">4 (4)</td></tr><tr valign="top"><td colspan="16" rowspan="1"><b>Tumor site, number of scans (number of patients)</b></td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Oropharynx</td><td colspan="2" rowspan="1">145 (86)</td><td colspan="2" rowspan="1">27 (15)</td><td colspan="2" rowspan="1">7 (6)</td><td colspan="3" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">8 (8)</td><td colspan="3" rowspan="1">2 (2)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Lip, oral cavity, and pharynx</td><td colspan="2" rowspan="1">80 (52)</td><td colspan="2" rowspan="1">20 (8)</td><td colspan="2" rowspan="1">4 (4)</td><td colspan="3" rowspan="1">1 (1)</td><td colspan="2" rowspan="1">3 (3)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Tongue</td><td colspan="2" rowspan="1">53 (26)</td><td colspan="2" rowspan="1">8 (5)</td><td colspan="2" rowspan="1">1 (1)</td><td colspan="3" rowspan="1">2 (2)</td><td colspan="2" rowspan="1">7 (7)</td><td 
colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Larynx</td><td colspan="2" rowspan="1">46 (31)</td><td colspan="2" rowspan="1">8 (3)</td><td colspan="2" rowspan="1">2 (2)</td><td colspan="3" rowspan="1">2 (2)</td><td colspan="2" rowspan="1">4 (4)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Nasopharynx</td><td colspan="2" rowspan="1">48 (24)</td><td colspan="2" rowspan="1">5 (3)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Head, face, and neck</td><td colspan="2" rowspan="1">37 (23)</td><td colspan="2" rowspan="1">8 (3)</td><td colspan="2" rowspan="1">1 (1)</td><td colspan="3" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Nasal cavity</td><td colspan="2" rowspan="1">32 (19)</td><td colspan="2" rowspan="1">2 (1)</td><td colspan="2" rowspan="1">1 (1)</td><td colspan="3" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Connective and soft tissue</td><td colspan="2" rowspan="1">37 (18)</td><td colspan="2" rowspan="1">2 (1)</td><td colspan="2" rowspan="1">1 (1)</td><td colspan="3" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Hypopharynx</td><td colspan="2" rowspan="1">17 (10)</td><td colspan="2" rowspan="1">1 (1)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">2 (1)</td><td colspan="2" 
rowspan="1">1 (1)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Accessory sinus</td><td colspan="2" rowspan="1">10 (7)</td><td colspan="2" rowspan="1">2 (1)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Esophagus</td><td colspan="2" rowspan="1">6 (2)</td><td colspan="2" rowspan="1">1 (1)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Other</td><td colspan="2" rowspan="1">33 (20)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">1 (1)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Unknown</td><td colspan="2" rowspan="1">119 (71)</td><td colspan="2" rowspan="1">16 (9)</td><td colspan="2" rowspan="1">4 (3)</td><td colspan="3" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">13 (13)</td></tr><tr valign="top"><td colspan="16" rowspan="1"><b>Source, number of scans (number of patients)</b></td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">TCGA<sup>e</sup></td><td colspan="2" rowspan="1">&#8212;<sup>f</sup></td><td colspan="2" rowspan="1">&#8212;</td><td colspan="2" rowspan="1">&#8212;</td><td colspan="3" rowspan="1">2 (2)</td><td colspan="2" rowspan="1">7 (7)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">HN_Cetux<sup>g</sup></td><td colspan="2" 
rowspan="1">&#8212;</td><td colspan="2" rowspan="1">&#8212;</td><td colspan="2" rowspan="1">&#8212;</td><td colspan="3" rowspan="1">5 (4)</td><td colspan="2" rowspan="1">17 (17)</td><td colspan="3" rowspan="1">15 (15)</td></tr><tr valign="top"><td colspan="16" rowspan="1"><b>Site, number of scans (number of patients)</b></td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">UCLH</td><td colspan="2" rowspan="1">663 (389)</td><td colspan="2" rowspan="1">100 (51)</td><td colspan="2" rowspan="1">21 (19)</td><td colspan="3" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">MD Anderson Cancer Center</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">2 (2)</td><td colspan="2" rowspan="1">7 (7)</td><td colspan="3" rowspan="1">0 (0)</td></tr><tr valign="top"><td rowspan="1" colspan="1"><br/></td><td rowspan="1" colspan="1">Unknown (US)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="2" rowspan="1">0 (0)</td><td colspan="3" rowspan="1">5 (4)</td><td colspan="2" rowspan="1">17 (17)</td><td colspan="3" rowspan="1">15 (15)</td></tr></tbody></table><fn id="table1fn1"><p><sup>a</sup>Tumor sites were derived from International Classification of Diseases codes. The Cancer Genome Atlas Head-Neck Squamous Cell Carcinoma [<span class="footers"><a class="citation-link" href="#ref52" rel="footnote">52</a></span>] is an open-source data set hosted on The Cancer Imaging Archive (TCIA). Head-Neck Cetuximab is an open-source data set hosted on TCIA [<span class="footers"><a class="citation-link" href="#ref53" rel="footnote">53</a></span>]. 
Public Domain Database for Computational Anatomy data set released as part of the 2015 challenge in the segmentation of head and neck anatomy at the International Conference on Medical Image Computing and Computer Assisted Intervention.</p></fn><fn id="table1fn2"><p><sup>b</sup>UCLH: University College London Hospitals.</p></fn><fn id="table1fn3"><p><sup>c</sup>TCIA: The Cancer Imaging Archive.</p></fn><fn id="table1fn4"><p><sup>d</sup>PDDCA: Public Domain Database for Computational Anatomy.</p></fn><fn id="table1fn5"><p><sup>e</sup>TCGA: The Cancer Genome Atlas Program.</p></fn><fn id="table1fn6"><p><sup>f</sup>The University College London Hospitals (UCLH) data set was sourced entirely from UCLH.</p></fn><fn id="table1fn7"><p><sup>g</sup>HN_Cetux: Head-Neck Cetuximab.</p></fn></div><h4>Clinical Taxonomy</h4><p class="abstract-paragraph">To select the organs at risk to be included in the study, we used the Brouwer Atlas (consensus guidelines for delineating organs at risk for head and neck radiotherapy, defined by an international panel of radiation oncologists [<span class="footers"><a class="citation-link" href="#ref51" rel="footnote">51</a></span>]). From this, we excluded those regions that required additional magnetic resonance imaging for segmentation, those that were not relevant to routine head and neck radiotherapy, or those that were not used clinically at UCLH. This resulted in a set of 21 organs at risk (<span class="footers"><a class="citation-link" href="#table2" rel="footnote">Table 2</a></span>).</p><div class="figure-table"><figcaption><span class="typcn typcn-clipboard"/>Table 2. 
Taxonomy of segmentation regions.</figcaption><table width="1000" cellpadding="5" cellspacing="0" border="1" rules="groups" frame="hsides"><col width="200" span="1"></col><col width="250" span="1"></col><col width="550" span="1"></col><thead><tr valign="top"><td rowspan="1" colspan="1">Organ at risk</td><td rowspan="1" colspan="1">Total number of labeled slices included</td><td rowspan="1" colspan="1">Anatomical landmarks and definition</td></tr></thead><tbody><tr valign="top"><td rowspan="1" colspan="1">Brain</td><td rowspan="1" colspan="1">11,476</td><td rowspan="1" colspan="1">Sits inside the cranium and includes all brain vessels excluding the brainstem and optic chiasm.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Brainstem</td><td rowspan="1" colspan="1">34,794</td><td rowspan="1" colspan="1">The posterior aspect of the brain including the midbrain, pons, and medulla oblongata. Extending inferior from the lateral ventricles to the tip of the dens at C2. It is structurally continuous with the spinal cord.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Cochlea-left</td><td rowspan="1" colspan="1">4526</td><td rowspan="1" colspan="1">Embedded in the temporal bone and lateral to the internal auditory meatus.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Cochlea-right</td><td rowspan="1" colspan="1">4754</td><td rowspan="1" colspan="1">Embedded in the temporal bone and lateral to the internal auditory meatus.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Lacrimal-left</td><td rowspan="1" colspan="1">17,186</td><td rowspan="1" colspan="1">Concave-shaped gland located at the superolateral aspect of the orbit.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Lacrimal-right</td><td rowspan="1" colspan="1">17,788</td><td rowspan="1" colspan="1">Concave-shaped gland located at the superolateral aspect of the orbit.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Lens-left</td><td rowspan="1" colspan="1">3006</td><td rowspan="1" 
colspan="1">An oval structure that sits within the anterior segment of the orbit. Can be variable in position but never sitting posterior beyond the level of the outer canthus.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Lens-right</td><td rowspan="1" colspan="1">3354</td><td rowspan="1" colspan="1">An oval structure that sits within the anterior segment of the orbit. Can be variable in position but never sitting posterior beyond the level of the outer canthus.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Lung-left</td><td rowspan="1" colspan="1">8340</td><td rowspan="1" colspan="1">Encompassed by the thoracic cavity adjacent to the lateral aspect of the mediastinum, extending from the first rib to the diaphragm excluding the carina.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Lung-right</td><td rowspan="1" colspan="1">9158</td><td rowspan="1" colspan="1">Encompassed by the thoracic cavity adjacent to the lateral aspect of the mediastinum, extending from the first rib to the diaphragm excluding the carina.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Mandible</td><td rowspan="1" colspan="1">25,074</td><td rowspan="1" colspan="1">The entire mandible bone including the temporomandibular joint, ramus, and body, excluding the teeth. 
The mandible joins to the inferior aspect of the temporal bone and forms the entire lower jaw.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Optic-nerve-left</td><td rowspan="1" colspan="1">3458</td><td rowspan="1" colspan="1">A 2 to 5 mm thick nerve that runs from the posterior aspect of the eye, through the optic canal and ends at the lateral aspect of the optic chiasm.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Optic-nerve-right</td><td rowspan="1" colspan="1">3012</td><td rowspan="1" colspan="1">A 2 to 5 mm thick nerve that runs from the posterior aspect of the eye, through the optic canal and ends at the lateral aspect of the optic chiasm.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Orbit-left</td><td rowspan="1" colspan="1">8538</td><td rowspan="1" colspan="1">Spherical organ sitting within the orbital cavity. Includes the vitreous humor, retina, cornea, and lens with the optic nerve attached posteriorly.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Orbit-right</td><td rowspan="1" colspan="1">8242</td><td rowspan="1" colspan="1">Spherical organ sitting within the orbital cavity. Includes the vitreous humor, retina, cornea, and lens with the optic nerve attached posteriorly.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Parotid-left</td><td rowspan="1" colspan="1">8984</td><td rowspan="1" colspan="1">Multi-lobed salivary gland wrapped around the mandibular ramus. Extends medially to the styloid process and parapharyngeal space. Laterally extending to the subcutaneous fat. Posteriorly extending to the sternocleidomastoid muscle. Anterior extending to posterior border of the mandible bone and masseter muscle. In cases where the retromandibular vein is encapsulated by parotid, this is included in the segmentation.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Parotid-right</td><td rowspan="1" colspan="1">11,752</td><td rowspan="1" colspan="1">Multi-lobed salivary gland wrapped around the mandibular ramus. 
Extends medially to the styloid process and parapharyngeal space. Laterally extending to the subcutaneous fat. Posteriorly extending to the sternocleidomastoid muscle. Anterior extending to posterior border of the mandible bone and masseter muscle. In cases where the retromandibular vein is encapsulated by parotid this is included in the segmentation.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Spinal-canal</td><td rowspan="1" colspan="1">37,000</td><td rowspan="1" colspan="1">Hollow cavity that runs through the foramen of the vertebrae, extending from the base of skull to the end of the sacrum.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Spinal-cord</td><td rowspan="1" colspan="1">37,096</td><td rowspan="1" colspan="1">Sits inside the spinal canal and extends from the level of the foramen magnum to the bottom of L2.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Submandibular-left</td><td rowspan="1" colspan="1">10,652</td><td rowspan="1" colspan="1">Sits within the submandibular portion of the anterior triangle of the neck, making up the floor of the mouth and extending both superior and inferior to the posterior aspect of the mandible and is limited laterally by the mandible and medially by the hypoglossal muscle.</td></tr><tr valign="top"><td rowspan="1" colspan="1">Submandibular-right</td><td rowspan="1" colspan="1">10,716</td><td rowspan="1" colspan="1">Sits within the submandibular portion of the anterior triangle of the neck, making up the floor of the mouth and extending both superior and inferior to the posterior aspect of the mandible and is limited laterally by the mandible and medially by the hypoglossal muscle.</td></tr></tbody></table></div><h4>Clinical Labeling and Annotation</h4><p class="abstract-paragraph">Owing to the large variability of segmentation protocols used and annotation quality in the UCLH data set, all segmentations from all scans selected for inclusion in the training set were manually reviewed by a 
radiographer with at least 4 years of experience in the segmentation of head and neck organs at risk. Volumes that did not conform to the Brouwer Atlas were excluded from the training. To increase the number of training examples, additional axial slices were randomly selected for further manual organ at risk segmentations to be added based on model performance or perceived imbalances in the data set. These were then produced by a radiographer with at least 4 years of experience in head and neck radiotherapy, arbitrated by a second radiographer with the same level of experience. The total number of examples from the original UCLH segmentations and additional slices are provided in <span class="footers"><a class="citation-link" href="#table2" rel="footnote">Table 2</a></span>.</p><p class="abstract-paragraph">For the TCIA test and validation sets, the original dense segmentations were not used owing to poor adherence to the chosen study protocol. To produce the ground truth labels, the full volumes of all 21 organs at risk included in the study were segmented. This was done initially by a radiographer with at least 4 years of experience in the segmentation of head and neck organs at risk and then arbitrated by a second radiographer with similar experience. Further arbitration was then performed by a radiation oncologist with at least 5 years of postcertification experience in head and neck radiotherapy. The same process was repeated with 2 additional radiographers working independently, but after peer arbitration, these segmentations were not reviewed by an oncologist; rather, they became the human reference to which the model was compared. This is schematically shown in <span class="footers"><a class="citation-link" href="#figure3" rel="footnote">Figure 3</a></span>. 
Before participation, all radiographers and oncologists were required to study the Brouwer Atlas for head and neck organ at risk segmentation [<span class="footers"><a class="citation-link" href="#ref51" rel="footnote">51</a></span>] and demonstrate competence in adhering to these guidelines.</p><figure><a name="figure3">&#8206;</a><a class="fancybox" title="Figure 3. Process for the segmentation of ground truth and radiographer organs at risk volumes. The flowchart illustrates how the ground truth segmentations were created and compared with independent radiographer segmentations and the model. For the ground truth, each computed tomography scan in The Cancer Imaging Archive test set was segmented first by a radiographer and peer reviewed by a second radiographer. This then went through one or more iterations of review and editing with a specialist oncologist before creating a ground truth used to compare with the segmentations produced by both the model and additional radiographer. CT: computed tomography." href="https://asset.jmir.pub/assets/927f4012c92da3f5a055064e95bb5f0b.png" id="figure3"><img class="figure-image" src="https://asset.jmir.pub/assets/927f4012c92da3f5a055064e95bb5f0b.png"/></a><figcaption><span class="typcn typcn-image"/>Figure 3. Process for the segmentation of ground truth and radiographer organs at risk volumes. The flowchart illustrates how the ground truth segmentations were created and compared with independent radiographer segmentations and the model. For the ground truth, each computed tomography scan in The Cancer Imaging Archive test set was segmented first by a radiographer and peer reviewed by a second radiographer. This then went through one or more iterations of review and editing with a specialist oncologist before creating a ground truth used to compare with the segmentations produced by both the model and additional radiographer. CT: computed tomography. 
</figcaption></figure><h4>Model Architecture</h4><p class="abstract-paragraph">We used a residual 3D U-Net architecture with 8 levels (<span class="footers"><a class="citation-link" href="#figure4" rel="footnote">Figure 4</a></span>). Our network takes in a CT volume (single channel) and outputs a segmentation mask with 21 channels, where each channel contains a binary segmentation mask for a specific organ at risk. The network consists of 7 residual convolutional blocks in the downward path, a residual fully connected block at the bottom, and 7 residual convolutional blocks in the upward path. A 1&#215;1&#215;1 convolution layer with sigmoidal activation produces the final output at the original resolution of the input image. Each predicted slice was given 21 slices of context; this 21-slice context (ie, 21 &#215; 2.5 mm=52.5 mm) was found to provide the optimal amount of context. That this number matches the 21 organs at risk segmented in this study is coincidental.</p><figure><a name="figure4">&#8206;</a><a class="fancybox" title="Figure 4. 3D U-Net model architecture. 
(a) At training time, the model receives 21 contiguous computed tomography slices, which are processed through a series of &#8220;down&#8221; blocks, a fully connected block, and a series of &#8220;up&#8221; blocks to create a segmentation prediction. (b) A detailed view of the convolutional residual down and up blocks and the residual fully connected block." href="https://asset.jmir.pub/assets/a8378390342460c336ae6203846eba44.png" id="figure4"><img class="figure-image" src="https://asset.jmir.pub/assets/a8378390342460c336ae6203846eba44.png"/></a><figcaption><span class="typcn typcn-image"/>Figure 4. 3D U-Net model architecture. (a) At training time, the model receives 21 contiguous computed tomography slices, which are processed through a series of &#8220;down&#8221; blocks, a fully connected block, and a series of &#8220;up&#8221; blocks to create a segmentation prediction. (b) A detailed view of the convolutional residual down and up blocks and the residual fully connected block. </figcaption><a class="fancybox" href="https://asset.jmir.pub/assets/a8378390342460c336ae6203846eba44.png" title="Figure 4. 3D U-Net model architecture. (a) At training time, the model receives 21 contiguous computed tomography slices, which are processed through a series of &#8220;down&#8221; blocks, a fully connected block, and a series of &#8220;up&#8221; blocks to create a segmentation prediction. (b) A detailed view of the convolutional residual down and up blocks and the residual fully connected block.">View this figure</a></figure><p class="abstract-paragraph">We trained our network with a regularized top-<i>k</i>-percent, pixel-wise, binary, cross-entropy loss [<span class="footers"><a class="citation-link" href="#ref54" rel="footnote">54</a></span>]; for each output channel, the top-<i>k</i> loss selects only the <i>k</i>% most difficult pixels (those with the highest binary cross-entropy) and only adds their contribution to the total loss. 
This speeds up training and helps the network to tackle the large class imbalance and to focus on difficult examples.</p><p class="abstract-paragraph">We regularized the model using standard L2 weight regularization with scale 10<sup>&#8722;6</sup> and extensive data augmentation using random in-plane (ie, in <i>x</i> and <i>y</i> directions only) translation, rotation, scaling, shearing, mirroring, elastic deformations, and pixel-wise noise. We used uniform translations between &#8722;32 and 32 pixels, uniform rotations between &#8722;9&#176; and 9&#176;, uniform scaling factors between 0.8 and 1.2, and uniform shear factors between &#8722;0.1 and 0.1. We mirrored the images (and adjusted the corresponding left and right labels) with a probability of 0.5. We performed elastic deformations by placing random displacement vectors (SD 5 mm, in-plane displacements only) on a control point grid with 100&#215;100&#215;100 mm spacing and by deriving the dense deformation field using cubic B-spline interpolation. In the implementation, all spatial transformations are first combined into a dense deformation field, which is then applied to the image using bilinear interpolation and extrapolation with zero padding. We added zero-mean Gaussian intensity noise independently to each pixel with an SD of 20 Hounsfield units.</p><p class="abstract-paragraph">We trained the model with the Adam optimizer [<span class="footers"><a class="citation-link" href="#ref53" rel="footnote">53</a></span>] for 120,000 steps and a batch size of 32 (32 graphics processing units) using synchronous stochastic gradient descent. 
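As a concrete illustration, the top-k-percent binary cross-entropy loss described above can be sketched with NumPy. The function name and the per-channel flattened layout are assumptions; a training implementation would operate on framework tensors with gradients rather than arrays.

```python
import numpy as np

def topk_binary_cross_entropy(probs, labels, k_percent=10.0, eps=1e-7):
    """Top-k-percent pixel-wise binary cross-entropy.

    probs:  sigmoid outputs, shape (num_channels, num_pixels).
    labels: binary ground truth of the same shape.
    For each output channel, only the k% of pixels with the highest
    per-pixel loss (the most difficult pixels) contribute to the total.
    """
    probs = np.clip(probs, eps, 1.0 - eps)
    # Per-pixel binary cross-entropy.
    bce = -(labels * np.log(probs) + (1 - labels) * np.log(1 - probs))
    num_keep = max(1, int(round(bce.shape[1] * k_percent / 100.0)))
    # Keep only the hardest pixels in each channel.
    hardest = np.sort(bce, axis=1)[:, -num_keep:]
    return hardest.sum(axis=1).mean()
```

Selecting only the hardest pixels per channel keeps the vast number of easily classified background voxels from dominating the gradient, which is how the loss addresses the class imbalance mentioned above.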
We used an initial learning rate of 10<sup>&#8722;4</sup> and scaled the learning rate by 1/2, 1/8, 1/64, and 1/256 at time steps of 24,000, 60,000, 108,000, and 114,000, respectively.</p><p class="abstract-paragraph">We used the validation set to select the model that performed at over 95% for most organs at risk according to our chosen surface DSC performance metric, breaking ties by preferring better performance on the more clinically impactful organs at risk and then higher absolute performance.</p><h4>Performance Metrics</h4><p class="abstract-paragraph">All performance metrics are reported for each organ independently (eg, separately for just the left parotid), so we only need to deal with binary masks (eg, a left parotid voxel vs a non&#8211;left-parotid voxel). Masks are defined as a subset of <img class="inline-graphic-image" alt="" src="https://asset.jmir.pub/assets/3f007e4a26946ad814aa5bc2778fb2d4.png" border="0" style="width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;"/>, that is, <img class="inline-graphic-image" alt="" src="https://asset.jmir.pub/assets/a4794be6bc36041d9601047477506cf6.png" border="0" style="width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;"/> (<span class="footers"><a class="citation-link" href="#figure5" rel="footnote">Figure 5</a></span>).</p><figure><a name="figure5">&#8206;</a><a class="fancybox" title="Figure 5. Illustrations of masks, surfaces, border regions, and the &#8220;overlapping&#8221; surface at tolerance &#964;." href="https://asset.jmir.pub/assets/e0be8500f85a93e28f44b06d90eb45fb.png" id="figure5"><img class="figure-image" src="https://asset.jmir.pub/assets/e0be8500f85a93e28f44b06d90eb45fb.png"/></a><figcaption><span class="typcn typcn-image"/>Figure 5. Illustrations of masks, surfaces, border regions, and the &#8220;overlapping&#8221; surface at tolerance &#964;.
</figcaption><a class="fancybox" href="https://asset.jmir.pub/assets/e0be8500f85a93e28f44b06d90eb45fb.png" title="Figure 5. Illustrations of masks, surfaces, border regions, and the &#8220;overlapping&#8221; surface at tolerance &#964;.">View this figure</a></figure><p class="abstract-paragraph">The volume of a mask is denoted as <img class="inline-graphic-image" alt="" src="https://asset.jmir.pub/assets/694ddae7b041e6d1952f700634d0acbc.png" border="0" style="width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;"/>, with</p><blockquote><img class="graphic-image" alt="" src="https://asset.jmir.pub/assets/3fa19e4f0be5b69a802dd0f12eb23ee6.png" border="0" style="text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;"/></blockquote><p class="abstract-paragraph">With this notation, the standard (volumetric) DSC for two given masks <i>M</i><sub>1</sub> and <i>M</i><sub>2</sub> can be written as:</p><blockquote><img class="graphic-image" alt="" src="https://asset.jmir.pub/assets/6deb202a14a449a075ea5b82910d2f90.png" border="0" style="text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;"/></blockquote><p class="abstract-paragraph">In the case of sparse ground truth segmentations (ie, only a few slices of the CT scan are labeled), we estimate the volumetric DSC by aggregating data from labeled voxels across multiple scans and patients as</p><blockquote><img class="graphic-image" alt="" src="https://asset.jmir.pub/assets/9805ff85f092184875b547c172cb2e91.png" border="0" style="text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;"/></blockquote><p class="abstract-paragraph">where the mask <i>M</i><sub>1,</sub><i><sub>p</sub></i> and the labeled region <i>L<sub>p</sub></i> represent the sparse ground truth segmentation for a patient <i>p</i> and the mask <i>M</i><sub>2,</sub><i><sub>p</sub></i> is the full volume
predicted segmentation for the patient <i>p</i>.</p><p class="abstract-paragraph">Owing to the shortcomings of the volumetric DSC metric for the presented radiotherapy use case, we introduced the <i>surface DSC</i> metric, which assesses the overlap of two surfaces (at a specified tolerance) instead of the overlap of two volumes (see <i>Results</i> section). A surface is the border of a mask, <img class="inline-graphic-image" alt="" src="https://asset.jmir.pub/assets/bd9b9263e53953e076fb11bab157ebb3.png" border="0" style="width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;"/>, and the area of the surface is denoted as</p><blockquote><img class="graphic-image" alt="" src="https://asset.jmir.pub/assets/690e2bf25800200a3b0cadc1c0fc697f.png" border="0" style="text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;"/></blockquote><p class="abstract-paragraph">where <img class="inline-graphic-image" alt="" src="https://asset.jmir.pub/assets/342d467e9edd7fc4b51b06d55466c27b.png" border="0" style="width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;"/> is a point on the surface using arbitrary parameterization. The mapping from this parameterization to a point in <img class="inline-graphic-image" alt="" src="https://asset.jmir.pub/assets/a1d33ba387e9ccab97a506db3e0392ce.png" border="0" style="width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;"/> is denoted as <img class="inline-graphic-image" alt="" src="https://asset.jmir.pub/assets/91159bf804fe2c4fae791272fb556053.png" border="0" style="width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;"/>, that is, <img class="inline-graphic-image" alt="" src="https://asset.jmir.pub/assets/69078fc22497f2707c06e912c3218ff7.png" border="0" style="width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;"/>. 
With this we can define the border region <img class="inline-graphic-image" alt="" src="https://asset.jmir.pub/assets/693bab5ffe535fb785c90130ff377c38.png" border="0" style="width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;"/>, for the surface <i>S<sub>i</sub></i>, at a given tolerance <i>&#964;</i> as (<span class="footers"><a class="citation-link" href="#figure5" rel="footnote">Figure 5</a></span>)</p><blockquote><img class="graphic-image" alt="" src="https://asset.jmir.pub/assets/005a7b98c397f6836b094b8fa0b4a3fa.png" border="0" style="text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;"/></blockquote><p class="abstract-paragraph">Using these definitions, we can write the <i>surface DSC at tolerance &#964;</i> as</p><blockquote><img class="graphic-image" alt="" src="https://asset.jmir.pub/assets/4b6c33e0555e226e29c7fbbb351f01b7.png" border="0" style="text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;"/></blockquote><p class="abstract-paragraph">using an informal notation for the intersection of the surface with the boundary, that is,</p><blockquote><img class="graphic-image" alt="" src="https://asset.jmir.pub/assets/608002607a80f003d874f34b4ddb1e6c.png" border="0" style="text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;"/></blockquote><h4>Implementation of Surface DSC</h4><p class="abstract-paragraph">The computation of surface integrals on sampled images is not straightforward, especially for medical images, where the voxel spacing is usually not equal in all 3 dimensions. The common approximation of the integral by counting the surface voxels can lead to substantial systematic errors.</p><p class="abstract-paragraph">Another common challenge is the representation of a surface with voxels. 
As the surface of a binary mask is located between voxels, a definition of <i>surface voxels</i> in the raster-space of the image introduces a bias: using foreground voxels to represent the surface leads to an underestimation of the surface, whereas the use of background voxels leads to an overestimation.</p><p class="abstract-paragraph">Our proposed implementation uses a surface representation that provides less-biased estimates but still allows us to compute the performance metrics with linear complexity O(<i>N</i>) (with <i>N</i> the number of voxels). We placed the surface points between the voxels on a raster that is shifted by half of the raster spacing on each axis (see <span class="footers"><a class="citation-link" href="#figure6" rel="footnote">Figure 6</a></span> for a 2D illustration).</p><figure><a name="figure6">&#8206;</a><a class="fancybox" title="Figure 6. 2D illustration of the implementation of the surface Dice similarity coefficient. (a) A binary mask displayed as an image. The origin of the image raster is (0,0). (b) The surface points (red circles) are located in a raster that is shifted half of the raster spacing on each axis. Each surface point has 4 neighbors in 2D (8 neighbors in 3D). The local contour (blue line) assigned to each surface point (red circle) depends on the neighbor constellation." href="https://asset.jmir.pub/assets/d1861009f50dd76330c2350f605fa9bf.png" id="figure6"><img class="figure-image" src="https://asset.jmir.pub/assets/d1861009f50dd76330c2350f605fa9bf.png"/></a><figcaption><span class="typcn typcn-image"/>Figure 6. 2D illustration of the implementation of the surface Dice similarity coefficient. (a) A binary mask displayed as an image. The origin of the image raster is (0,0). (b) The surface points (red circles) are located in a raster that is shifted half of the raster spacing on each axis. Each surface point has 4 neighbors in 2D (8 neighbors in 3D).
The local contour (blue line) assigned to each surface point (red circle) depends on the neighbor constellation. </figcaption><a class="fancybox" href="https://asset.jmir.pub/assets/d1861009f50dd76330c2350f605fa9bf.png" title="Figure 6. 2D illustration of the implementation of the surface Dice similarity coefficient. (a) A binary mask displayed as an image. The origin of the image raster is (0,0). (b) The surface points (red circles) are located in a raster that is shifted half of the raster spacing on each axis. Each surface point has 4 neighbors in 2D (8 neighbors in 3D). The local contour (blue line) assigned to each surface point (red circle) depends on the neighbor constellation.">View this figure</a></figure><p class="abstract-paragraph">For 3D images, each point in the raster has 8 neighboring voxels. As we analyzed binary masks, there are only 2<sup>8</sup>=256 possible neighbor constellations. For each of these constellations, we computed the resulting triangles using the marching cube triangulation [<span class="footers"><a class="citation-link" href="#ref55" rel="footnote">55</a></span>,<span class="footers"><a class="citation-link" href="#ref56" rel="footnote">56</a></span>] and stored the surface area of the triangles (in mm<sup>2</sup>) in a look-up table. With this look-up table, we then created a surface image (on the above-mentioned raster) that contains zeros at positions that have 8 identical neighbors and the local surface area at all positions that have both foreground and background neighbors. These surface images were created for masks <i>M</i><sub>1</sub> and <i>M</i><sub>2</sub>. In addition, we created a distance map from each of these surface images using the distance transform algorithm [<span class="footers"><a class="citation-link" href="#ref57" rel="footnote">57</a></span>].
Iterating over the nonzero elements in the first surface image and looking up the distance from the other surface in the corresponding distance map allows the creation of a list of tuples (surface element area and distance from the other surface). From this list, we can easily compute the surface area by summing the area of the surface elements that are within the tolerance. Because the distances are quantized (there is only a discrete set <img class="inline-graphic-image" alt="" src="https://asset.jmir.pub/assets/579a0894d6013d9da94e9a74b700579f.png" border="0" style="width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;"/> of distances between voxels in a 3D raster with spacing (<i>d</i><sub>1</sub>, <i>d</i><sub>2</sub>, <i>d</i><sub>3</sub>)), we rounded the tolerance to the nearest element of the set <i>D</i> for each image before computing the surface DSC. Our open-source implementation of surface DSC provides more details.</p><br/><h3 class="navigation-heading h3-main-heading" id="Results" data-label="Results">Results</h3><h4>Selecting Clinically Representative Data Sets</h4><p class="abstract-paragraph">Data sets are described in detail in the Methods section. In brief, the first data set was a representative sample of CT scans used to plan curative-intent radiotherapy of head and neck cancer for patients at UCLH NHS Foundation Trust, a single high-volume center. We performed iterative cycles of model development using the UCLH scans (<i>training</i> and <i>validation</i> subsets), taking the performance on a previously unseen subset (<i>test</i>) as our primary outcome.</p><p class="abstract-paragraph">It is also important to demonstrate a model&#8217;s generalizability to data from previously unseen demographics and distributions. To do this, we curated test and validation data sets from open-source CT scans.
These comprised the <i>TCIA test set</i> [<span class="footers"><a class="citation-link" href="#ref48" rel="footnote">48</a></span>-<span class="footers"><a class="citation-link" href="#ref50" rel="footnote">50</a></span>] and the <i>PDDCA data set</i> released as part of the 2015 challenge (<i>PDDCA test set</i> [<span class="footers"><a class="citation-link" href="#ref25" rel="footnote">25</a></span>]).
This was compared with our study&#8217;s <i>gold standard</i> ground truth graded by 2 other radiographers and arbitrated by one of 2 independent specialist oncologists, each with a minimum of 4 years of specialist experience in radiotherapy treatment planning for patients with head and neck cancer.</p><p class="abstract-paragraph">An example of model performance is shown in <span class="footers"><a class="citation-link" href="#figure7" rel="footnote">Figure 7</a></span>; two further randomly selected UCLH set scans are shown in Figures S1 and S2 of <span class="footers"><a class="citation-link" href="#app1" rel="footnote">Multimedia Appendix 1</a></span> [<span class="footers"><a class="citation-link" href="#ref19" rel="footnote">19</a></span>-<span class="footers"><a class="citation-link" href="#ref31" rel="footnote">31</a></span>,<span class="footers"><a class="citation-link" href="#ref34" rel="footnote">34</a></span>-<span class="footers"><a class="citation-link" href="#ref46" rel="footnote">46</a></span>,<span class="footers"><a class="citation-link" href="#ref56" rel="footnote">56</a></span>-<span class="footers"><a class="citation-link" href="#ref90" rel="footnote">90</a></span>]. Three randomly selected TCIA set scans are shown in Figures S3, S4, and S5 of <span class="footers"><a class="citation-link" href="#app1" rel="footnote">Multimedia Appendix 1</a></span> to visually demonstrate the model&#8217;s generalizability. We compared our performance (model vs oncologist) with radiographer performance (radiographer vs oncologist). For more information on data set selection and inclusion and exclusion criteria for patients and organs at risk, see the <i>Methods</i> section.</p><figure><a name="figure7">&#8206;</a><a class="fancybox" title="Figure 7. Example results.
Computed tomography (CT) image: axial slices at 5 representative levels from the raw CT scan of a male patient aged 55-59 years were selected from the University College London Hospitals data set (patient 20). The levels shown as 2D slices were selected to demonstrate all 21 organs at risk included in this study. The window leveling has been adjusted for each to best display the anatomy present. Oncologist contour: the ground truth segmentation, as defined by experienced radiographers and arbitrated by a head and neck specialist oncologist. Model contour: segmentations produced by our model. Contour comparison: contoured by oncologist only (green region) or model only (yellow region). Best viewed on a display. CT: computed tomography." href="https://asset.jmir.pub/assets/bc23e253cd9b35d2effefc6bad63e7cd.png" id="figure7"><img class="figure-image" src="https://asset.jmir.pub/assets/bc23e253cd9b35d2effefc6bad63e7cd.png"/></a><figcaption><span class="typcn typcn-image"/>Figure 7. Example results. Computed tomography (CT) image: axial slices at 5 representative levels from the raw CT scan of a male patient aged 55-59 years were selected from the University College London Hospitals data set (patient 20). The levels shown as 2D slices were selected to demonstrate all 21 organs at risk included in this study. The window leveling has been adjusted for each to best display the anatomy present. Oncologist contour: the ground truth segmentation, as defined by experienced radiographers and arbitrated by a head and neck specialist oncologist. Model contour: segmentations produced by our model. Contour comparison: contoured by oncologist only (green region) or model only (yellow region). Best viewed on a display. CT: computed tomography.
</figcaption><a class="fancybox" href="https://asset.jmir.pub/assets/bc23e253cd9b35d2effefc6bad63e7cd.png" title="Figure 7. Example results. Computed tomography (CT) image: axial slices at 5 representative levels from the raw CT scan of a male patient aged 55-59 years were selected from the University College London Hospitals data set (patient 20). The levels shown as 2D slices were selected to demonstrate all 21 organs at risk included in this study. The window leveling has been adjusted for each to best display the anatomy present. Oncologist contour: the ground truth segmentation, as defined by experienced radiographers and arbitrated by a head and neck specialist oncologist. Model contour: segmentations produced by our model. Contour comparison: contoured by oncologist only (green region) or model only (yellow region). Best viewed on a display. CT: computed tomography.">View this figure</a></figure><h4>A New Metric for Assessing Clinical Performance</h4><p class="abstract-paragraph">In routine clinical care, algorithm-derived segmentations are reviewed and potentially corrected by a human expert, just as those created by radiographers currently are. Segmentation performance is thus best assessed by determining the fraction of the surface that needs to be redrawn. The standard volumetric DSC [<span class="footers"><a class="citation-link" href="#ref91" rel="footnote">91</a></span>] is not well suited to this because it weighs all regions of misplaced delineation equally and independently of their distance from the surface. For example, two inaccurate segmentations could have a similar volumetric DSC score if one were to deviate from the correct surface boundary by a small amount in many places, whereas the other had a large deviation at a single point.
Correcting the former would likely take a considerable amount of time as it would require redrawing almost all of the boundary, whereas the latter could be corrected much faster, potentially with a single edit action.</p><p class="abstract-paragraph">For quantitative analysis, we therefore introduced a new segmentation performance metric, the <i>surface DSC</i> (<span class="footers"><a class="citation-link" href="#figure8" rel="footnote">Figure 8</a></span>), which assesses the overlap of two surfaces (at a specified tolerance) instead of the overlap of two volumes. This provides a measure of agreement between the surfaces of two structures, which is where most of the human effort in correcting is usually expended. In doing so, we also addressed the volumetric DSC&#8217;s bias toward large organs at risk, where the large (and mostly trivial) internal volume accounts for a much larger proportion of the score.</p><figure><a name="figure8">&#8206;</a><a class="fancybox" title="Figure 8. Surface Dice similarity coefficient performance metric. (a) Illustration of the computation of the surface Dice similarity coefficient. Continuous line: predicted surface. Dashed line: ground truth surface. Black arrow: the maximum margin of deviation that may be tolerated without penalty, hereafter referred to by &#964;. Note that in our use case each organ at risk has an independently calculated value for &#964;. Green: acceptable surface parts (distance between surfaces &#8804;&#964;). Pink: unacceptable regions of the surfaces (distance between surfaces &gt;&#964;). The proposed surface Dice similarity coefficient metric reports the good surface parts compared with the total surface (sum of predicted surface area and ground truth surface area). (b) Illustration of the determination of the organ-specific tolerance. Green: segmentation of an organ by oncologist A. Black: segmentation by oncologist B. Red: distances between the surfaces."
href="https://asset.jmir.pub/assets/22d8f0548a88dcb5fd0e3bf1929b02c8.png" id="figure8"><img class="figure-image" src="https://asset.jmir.pub/assets/22d8f0548a88dcb5fd0e3bf1929b02c8.png"/></a><figcaption><span class="typcn typcn-image"/>Figure 8. Surface Dice similarity coefficient performance metric. (a) Illustration of the computation of the surface Dice similarity coefficient. Continuous line: predicted surface. Dashed line: ground truth surface. Black arrow: the maximum margin of deviation that may be tolerated without penalty, hereafter referred to by &#964;. Note that in our use case each organ at risk has an independently calculated value for &#964;. Green: acceptable surface parts (distance between surfaces &#8804;&#964;). Pink: unacceptable regions of the surfaces (distance between surfaces &gt;&#964;). The proposed surface Dice similarity coefficient metric reports the good surface parts compared with the total surface (sum of predicted surface area and ground truth surface area). (b) Illustration of the determination of the organ-specific tolerance. Green: segmentation of an organ by oncologist A. Black: segmentation by oncologist B. Red: distances between the surfaces. </figcaption><a class="fancybox" href="https://asset.jmir.pub/assets/22d8f0548a88dcb5fd0e3bf1929b02c8.png" title="Figure 8. Surface Dice similarity coefficient performance metric. (a) Illustration of the computation of the surface Dice similarity coefficient. Continuous line: predicted surface. Dashed line: ground truth surface. Black arrow: the maximum margin of deviation that may be tolerated without penalty, hereafter referred to by &#964;. Note that in our use case each organ at risk has an independently calculated value for &#964;. Green: acceptable surface parts (distance between surfaces &#8804;&#964;). Pink: unacceptable regions of the surfaces (distance between surfaces &gt;&#964;).
The proposed surface Dice similarity coefficient metric reports the good surface parts compared with the total surface (sum of predicted surface area and ground truth surface area). (b) Illustration of the determination of the organ-specific tolerance. Green: segmentation of an organ by oncologist A. Black: segmentation by oncologist B. Red: distances between the surfaces.">View this figure</a></figure><p class="abstract-paragraph">When evaluating the surface DSC, we must define a threshold within which the variation is clinically acceptable. To do this, we first defined the organ-specific tolerances (in mm) as a parameter of the proposed metric, &#964;. We computed these acceptable tolerances for each organ by measuring the interobserver variation in segmentations between 3 different consultant oncologists (each with over 10 years of experience in organ at risk delineation) on the validation subset of TCIA images.</p><p class="abstract-paragraph">To penalize both false-negative and false-positive parts of the predicted surface, our proposed metric measures both of the nonsymmetric distances between the surfaces and then normalizes them by the combined surface area. Similar to volumetric DSC, the surface DSC ranges from 0 (no overlap) to 1 (perfect overlap).</p><p class="abstract-paragraph">There is no consensus as to what constitutes a nonsignificant variation in such a segmentation; we therefore selected a surface DSC of 0.95, a stringency that likely far exceeds the expert oncologist intrarater concordance [<span class="footers"><a class="citation-link" href="#ref19" rel="footnote">19</a></span>,<span class="footers"><a class="citation-link" href="#ref92" rel="footnote">92</a></span>]. Achieving this threshold means that approximately 95% of the surface was properly outlined (ie, within &#964; mm of the correct boundary), whereas 5% needs to be corrected.
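To make the definition concrete, here is a minimal Python sketch of a voxel-based surface DSC at tolerance &#964; (assuming SciPy; this is not our published implementation, which uses the less-biased, area-weighted surface elements described in the Methods section, and the function and parameter names are illustrative):

```python
import numpy as np
from scipy import ndimage

def surface_dsc(mask1, mask2, tolerance_mm, spacing_mm):
    """Approximate surface DSC at a tolerance for two binary 3D masks.

    Uses foreground surface voxels and Euclidean distance transforms; a
    simplified, slightly biased stand-in for the area-weighted
    implementation described in the Methods section.
    """
    def surface(mask):
        # Foreground voxels touching the border of the mask.
        return mask & ~ndimage.binary_erosion(mask)

    s1, s2 = surface(mask1), surface(mask2)
    # Distance (in mm) from every voxel to the nearest surface voxel,
    # honoring anisotropic voxel spacing via the sampling argument.
    d_to_s1 = ndimage.distance_transform_edt(~s1, sampling=spacing_mm)
    d_to_s2 = ndimage.distance_transform_edt(~s2, sampling=spacing_mm)
    # Surface elements of each mask lying within tolerance of the other,
    # normalized by the combined surface size.
    overlap = (d_to_s2[s1] <= tolerance_mm).sum() + (d_to_s1[s2] <= tolerance_mm).sum()
    return overlap / (s1.sum() + s2.sum())
```

Because this sketch counts surface voxels rather than integrating surface areas, it carries the small bias discussed in the Methods section; it is intended only to illustrate the definition.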
For a more formal definition and implementation, see the <i>Methods</i> section.</p><h4>Model Performance</h4><p class="abstract-paragraph">Model performance was evaluated alongside that of therapeutic radiographers (each with at least 4 years of experience) segmenting the test set of UCLH images independently of the oncologist-reviewed scans (which we used as our ground truth).</p><p class="abstract-paragraph">The model performed similarly to humans. For all organs at risk studied, there was no clinically meaningful difference between the deep learning model&#8217;s segmentations and those of the radiographers (<span class="footers"><a class="citation-link" href="#figure9" rel="footnote">Figure 9</a></span> and Tables S1 and S2, <span class="footers"><a class="citation-link" href="#app1" rel="footnote">Multimedia Appendix 1</a></span>). For details on the number of labeled scans in the UCLH test set, see Table S3 in <span class="footers"><a class="citation-link" href="#app1" rel="footnote">Multimedia Appendix 1</a></span>.</p><p class="abstract-paragraph">To investigate the generalizability of our model, we additionally evaluated its performance on open-source scans (<i>TCIA test set</i>). These were collected from sites in the United States, where patient demographics, clinical pathways for radiotherapy, and scanner type and parameters differed from our UK training set in meaningful ways. Nevertheless, model performance was preserved, and in 90% (19/21) of organs at risk, the model performed within the threshold defined for human variability (<span class="footers"><a class="citation-link" href="#figure10" rel="footnote">Figure 10</a></span>). The fact that performance in 2 organs at risk (brainstem and right lens) was lower than on the UK data may relate to issues of image quality in several TCIA test set scans.</p><figure><a name="figure9">&#8206;</a><a class="fancybox" title="Figure 9.
University College London Hospitals (UCLH) test set: quantitative performance of the model in comparison with radiographers. (a) The model achieves a surface Dice similarity coefficient similar to humans in all 21 organs at risk (on the UCLH held-out test set) when compared with the gold standard for each organ at an organ-specific tolerance &#964;. Blue: our model; green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate the most extreme data points within 1.5 IQR of the quartiles; data outside this range are represented as circular fliers. The notches represent the 95% CI around the median. DSC: Dice similarity coefficient; UCLH: University College London Hospitals." href="https://asset.jmir.pub/assets/575e67d4074029b6161117fe32c21a66.png" id="figure9"><img class="figure-image" src="https://asset.jmir.pub/assets/575e67d4074029b6161117fe32c21a66.png"/></a><figcaption><span class="typcn typcn-image"/>Figure 9. University College London Hospitals (UCLH) test set: quantitative performance of the model in comparison with radiographers. (a) The model achieves a surface Dice similarity coefficient similar to humans in all 21 organs at risk (on the UCLH held-out test set) when compared with the gold standard for each organ at an organ-specific tolerance &#964;. Blue: our model; green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate the most extreme data points within 1.5 IQR of the quartiles; data outside this range are represented as circular fliers. The notches represent the 95% CI around the median. DSC: Dice similarity coefficient; UCLH: University College London Hospitals. </figcaption><a class="fancybox" href="https://asset.jmir.pub/assets/575e67d4074029b6161117fe32c21a66.png" title="Figure 9. University College London Hospitals (UCLH) test set: quantitative performance of the model in comparison with radiographers. (a) The model achieves a surface Dice similarity coefficient similar to humans in all 21 organs at risk (on the UCLH held-out test set) when compared with the gold standard for each organ at an organ-specific tolerance &#964;. Blue: our model; green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate the most extreme data points within 1.5 IQR of the quartiles; data outside this range are represented as circular fliers. The notches represent the 95% CI around the median. DSC: Dice similarity coefficient; UCLH: University College London Hospitals.">View this figure</a></figure><figure><a name="figure10">&#8206;</a><a class="fancybox" title="Figure 10. Model generalizability to an independent test set from The Cancer Imaging Archive (TCIA). Quantitative performance of the model on the TCIA test set in comparison with radiographers. (a) Surface Dice similarity coefficient (on the TCIA open-source test set) for the segmentations compared with the gold standard for each organ at an organ-specific tolerance &#964;. Blue: our model; green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. Red lines show the mean difference.
The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate the most extreme data points within 1.5 IQR of the quartiles; data outside this range are represented as circular fliers. The notches represent the 95% CI around the median. DSC: Dice similarity coefficient; TCIA: The Cancer Imaging Archive." href="https://asset.jmir.pub/assets/695687440a4e5a83918929c8810032a5.png" id="figure10"><img class="figure-image" src="https://asset.jmir.pub/assets/695687440a4e5a83918929c8810032a5.png"/></a><figcaption><span class="typcn typcn-image"/>Figure 10. Model generalizability to an independent test set from The Cancer Imaging Archive (TCIA). Quantitative performance of the model on the TCIA test set in comparison with radiographers. (a) Surface Dice similarity coefficient (on the TCIA open-source test set) for the segmentations compared with the gold standard for each organ at an organ-specific tolerance &#964;. Blue: our model; green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. Red lines show the mean difference. The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate the most extreme data points within 1.5 IQR of the quartiles; data outside this range are represented as circular fliers. The notches represent the 95% CI around the median. DSC: Dice similarity coefficient; TCIA: The Cancer Imaging Archive. </figcaption><a class="fancybox" href="https://asset.jmir.pub/assets/695687440a4e5a83918929c8810032a5.png" title="Figure 10. Model generalizability to an independent test set from The Cancer Imaging Archive (TCIA). Quantitative performance of the model on the TCIA test set in comparison with radiographers.
(a) Surface Dice similarity coefficient (on the TCIA open-source test set) for the segmentations compared with the gold standard for each organ at an organ-specific tolerance &#964;. Blue: our model, green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. Red lines show the mean difference. The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate most extreme, nonoutlier data points. Where data lie outside, an IQR of 1.5 is represented as a circular flier. The notches represent the 95% CI around the median. DSC: Dice similarity coefficient; TCIA: The Cancer Imaging Archive.">View this figure</a></figure><p class="abstract-paragraph">For more detailed results demonstrating surface DSC and volumetric DSC for each individual patient from the TCIA test set, see Table S4 and Table S5, respectively, in <span class="footers"><a class="citation-link" href="#app1" rel="footnote">Multimedia Appendix 1</a></span>.</p><h4>Comparison With Previous Work</h4><p class="abstract-paragraph">An accurate quantitative comparison with previously published literature is difficult because of inherent differences in definitions of ground truth segmentations and varied processes of arbitration and consensus building. Given that the use of surface DSC is novel in this study, we also reported the standard volumetric DSC scores achieved by our algorithm (despite the shortcomings of this method) so that our results can be directly compared with those in the existing literature. An overview of past papers that have reported mean volumetric DSC for unedited automatic delineation of head and neck CT organs at risk can be found in Table S6, <span class="footers"><a class="citation-link" href="#app1" rel="footnote">Multimedia Appendix 1</a></span>. 
Each used different data sets, scanning parameters, and labeling protocols, meaning that the resulting volumetric DSC results varied significantly. No study, other than ours, segmented the lacrimal glands. We compared these results with those obtained when we applied our model to three different data sets: the TCIA open-source test set, an additional test set from the original UCLH data set (<i>UCLH test set</i>) and the data set released by the PDDCA as part of the 2015 Medical Image Computing and Computer Assisted Intervention head and neck radiotherapy organ at risk segmentation challenge (<i>PDDCA test set</i> [<span class="footers"><a class="citation-link" href="#ref25" rel="footnote">25</a></span>]). To contextualize the performance of our model, radiographer performance is shown on the TCIA test set, and oncologist interobserver variation is shown on the UCLH test set.</p><p class="abstract-paragraph">Although not the primary test set, we nevertheless present per-patient surface DSC and volumetric DSC for the PDDCA test set in Table S7 and Table S8 in <span class="footers"><a class="citation-link" href="#app1" rel="footnote">Multimedia Appendix 1</a></span>, respectively.</p><br/><h3 class="navigation-heading h3-main-heading" id="Discussion" data-label="Discussion">Discussion</h3><h4>Principal Findings</h4><p class="abstract-paragraph">We demonstrated an automated deep learning&#8211;based segmentation algorithm that can perform as well as experienced radiographers for head and neck radiotherapy planning. 
Our model was developed using CT scans derived from routine clinical practice and therefore should be applicable in a hospital setting for the segmentation of organs at risk, for routine radiation therapy quality assurance peer review, and for reducing the associated variability between different specialists and radiotherapy centers [<span class="footers"><a class="citation-link" href="#ref93" rel="footnote">93</a></span>].</p><p class="abstract-paragraph">Clinical applicability must be supported not only by high model performance but also by evidence of model generalizability to new external data sets. To demonstrate this, we presented results on three separate test sets, one of which (the PDDCA test set) uses a different segmentation protocol. In this study, performance in most organs at risk was maintained when tested on scans taken from a range of previously unseen international sites. Although these scans varied in patient demographics, scanning protocol, device manufacturer, and image quality, the model still achieved human performance on 19 of the 21 organs at risk studied; only the right lens and brainstem were below radiographer performance. For these organs at risk, the performance of the model might have been lower than expert performance owing to lower image quality. This is particularly evident for the right lens, where the anatomical borders were quite indistinct in some TCIA test set cases, thus preventing full segmentation by the model (Figure S6, <span class="footers"><a class="citation-link" href="#app1" rel="footnote">Multimedia Appendix 1</a></span>). Moreover, a precise CT definition of the brainstem&#8217;s proximal and distal boundaries is lacking, a factor that might have contributed to labeling variability and thus to decreased model performance. 
Finally, demographic bias may have resulted from the TCIA data set selection for cases of more advanced head and neck cancer [<span class="footers"><a class="citation-link" href="#ref48" rel="footnote">48</a></span>] or from variability in the training data [<span class="footers"><a class="citation-link" href="#ref10" rel="footnote">10</a></span>].</p><p class="abstract-paragraph">One major contribution of this paper is the presentation of a performance measure that represents the clinical task of organ at risk correction. In the first preprint of this work, we introduced surface DSC [<span class="footers"><a class="citation-link" href="#ref70" rel="footnote">70</a></span>], a metric conceived to be sensitive to clinically significant errors in organ at risk delineation. Surface DSC has recently been shown to be more strongly correlated with the amount of time required to correct segmentation for clinical use than traditional metrics, including volumetric DSC [<span class="footers"><a class="citation-link" href="#ref94" rel="footnote">94</a></span>,<span class="footers"><a class="citation-link" href="#ref95" rel="footnote">95</a></span>]. Small deviations in organ at risk border placement can have a potentially serious impact, increasing the risk of debilitating side effects for the patient. Misplacement by only a small offset may thus require the entire region to be redrawn, and in such cases, an automated segmentation algorithm may offer no time savings. Volumetric DSC is relatively insensitive to such small changes in large organs, as the absolute overlap is also large. Difficulties identifying the exact borders of smaller organs can result in large differences in volumetric DSC, even if these differences are not clinically relevant in terms of their effect on radiotherapy treatment. 
By strongly penalizing border placement outside a tolerance determined by consultant oncologists, the surface DSC metric resolves these issues.</p><p class="abstract-paragraph">Although volumetric DSC is therefore not representative of clinical consequences, it remains the most popular metric for evaluating segmentation models and is therefore the only metric that allows comparison with previously published works. In recent years, fully convolutional networks have become the most popular and successful methodology for organ at risk segmentation in head and neck CT for de novo radiotherapy planning [<span class="footers"><a class="citation-link" href="#ref40" rel="footnote">40</a></span>-<span class="footers"><a class="citation-link" href="#ref45" rel="footnote">45</a></span>,<span class="footers"><a class="citation-link" href="#ref58" rel="footnote">58</a></span>-<span class="footers"><a class="citation-link" href="#ref69" rel="footnote">69</a></span>]. Although not directly comparable owing to different data sets and labeling protocols, our volumetric DSC results compare favorably with the existing published literature for many of the organs at risk (see Table S6 and Figure S7, <span class="footers"><a class="citation-link" href="#app1" rel="footnote">Multimedia Appendix 1</a></span>, for more details on this and other prior publications). In organs at risk with inferior volumetric DSC scores compared with the published literature, both our model and human radiographers achieved similar scores. This suggests that current and previously published results are difficult to compare, either because of the inclusion of more difficult cases than in previous studies or because of different segmentation and scanning protocols. To allow more objective comparisons of different segmentation methods, we made our labeled TCIA data sets freely available to the academic community (see the Acknowledgments section on data availability). 
At least 11 auto-segmentation software solutions are currently available commercially, with varying claims regarding their potential to lower segmentation time during radiotherapy planning [<span class="footers"><a class="citation-link" href="#ref96" rel="footnote">96</a></span>]. The principal factor that determines whether automatic segmentation saves time during the radiotherapy workflow is the degree to which automated segmentations require correction by oncologists.</p><p class="abstract-paragraph">The wide variability in the state of the art and the limited uptake in routine clinical practice motivate the need for clinical studies evaluating model performance in practice. Future work will seek to define the clinical acceptability of the segmented organs at risk produced by our models and estimate the time saving that could be achieved during the radiotherapy planning workflow in a real-world setting.</p><p class="abstract-paragraph">A number of other limitations should be addressed in future studies. First, we included only planning CT scans because magnetic resonance imaging and positron emission tomography scans were not routinely performed for all patients in the UCLH data set. Some organ at risk classes, such as the optic chiasm, require co-registration with MR images for optimal delineation, and access to additional imaging has been shown to improve the delineation of the optic nerves [<span class="footers"><a class="citation-link" href="#ref29" rel="footnote">29</a></span>]. As a result, certain organ at risk classes were deliberately excluded from this CT-based project and will be addressed in future work that will incorporate magnetic resonance imaging scans. A second limitation concerns the classes of organs at risk included in this study. 
Although we presented one of the largest sets of reported organs at risk in the literature [<span class="footers"><a class="citation-link" href="#ref44" rel="footnote">44</a></span>,<span class="footers"><a class="citation-link" href="#ref97" rel="footnote">97</a></span>,<span class="footers"><a class="citation-link" href="#ref98" rel="footnote">98</a></span>], some omissions occurred (eg, the oral cavity) owing to an insufficient number of examples in the training data that conformed to a standard international protocol. The number of oncologists involved in the creation of our ground truth may not have fully captured the variability in organ at risk segmentation or may have been biased toward a particular interpretation of the Brouwer Atlas used as our segmentation protocol. Even in an organ as simple as the spinal cord, which is traditionally outlined reliably by auto-segmentation algorithms, there is ambiguity regarding the inclusion of, for example, the nerve roots. Such variation may widen the thresholds of acceptable deviation in favor of the model, despite a consistent protocol. Future studies will address these deficits alongside time-consuming lymph node segmentation.</p><p class="abstract-paragraph">Finally, neither of the test sets used in this study included the patients&#8217; protected-characteristic status. This is a significant limitation, as it prevents the study of intersectional fairness.</p><h4>Conclusions</h4><p class="abstract-paragraph">In conclusion, we demonstrated that deep learning can achieve human expert&#8211;level performance in the segmentation of head and neck organs at risk in radiotherapy planning CT scans, using a clinically applicable performance metric designed for this clinical scenario. We provided evidence of the generalizability of this model by testing it on patients from different geographies, demographics, and scanning protocols. 
This segmentation algorithm performed with accuracy similar to that of experts and has the potential to improve the speed, efficiency, and consistency of radiotherapy workflows, with an expected positive influence on patient outcomes. Future work will investigate the impact of our segmentation algorithm in clinical practice.</p></article><p><h4 class="h4-border-top">Acknowledgments</h4></p><p class="abstract-paragraph">The codebase for the deep learning framework makes use of proprietary components, and we are unable to publicly release this code. However, all experiments and implementation details are described in detail in the Methods section to allow independent replication with nonproprietary libraries. The surface DSC performance metric code is available on the internet [<span class="footers"><a class="citation-link" href="#ref99" rel="footnote">99</a></span>].</p><p class="abstract-paragraph">The clinical data used for the training and validation sets were collected and deidentified at the UCLH NHS Foundation Trust. The data were used with both local and national permissions. They are not publicly available, and restrictions apply to their use. The data, or a subset, may be available from the UCLH NHS Foundation Trust, subject to local and national ethical approvals. The released test or validation set data were collected from two data sets hosted on TCIA. 
The subset used, along with the ground truth segmentations added, is available on the internet [<span class="footers"><a class="citation-link" href="#ref100" rel="footnote">100</a></span>].</p><p class="abstract-paragraph">The authors thank the patients treated at UCLH whose scans were used in this work, A Zisserman, D King, D Barrett, V Cornelius, C Beltran, J Cornebise, R Sharma, J Ashburner, J Good, and N Haji for discussions, M Kosmin for his review of the published literature, J Adler for discussion and review of the manuscript, A Warry, U Johnson, V Rompokos, and the rest of the UCLH Radiotherapy Physics team for work on the data collection, R West for work on the visuals, C Game, D Mitchell, and M Johnson for infrastructure and systems administration, A Paine at Softwire for engineering support at UCLH, A Kitchener and the UCLH Information Governance team for support, J Besley and M Bawn for legal assistance, K Ayoub, K Sullivan, and R Ahmed for initiating and supporting the collaboration, the DeepMind Radiographer Consortium made up of B Garie, Y McQuinlan, K Hampton, S Ireland, K Fuller, H Frank, C Tully, A Jones, and L Turner, and the rest of the DeepMind team for their support, ideas, and encouragement. GR and HM were supported by University College London and the National Institute for Health Research UCLH Biomedical Research Centre. The views expressed are those of the authors and not necessarily those of the NHS, the National Institute for Health Research, or the Department of Health.</p><h4 class="h4-border-top">Authors' Contributions</h4><p><p class="abstract-paragraph">MS, TB, OR, JRL, RM, HM, SAM, DD, CC, and COH initiated the project. SB, RM, DC, CB, DD, CC, and JRL created the data sets. SB, SN, JDF, AZ, YP, COH, HA, and OR contributed to software engineering. SN, JDF, BRP, and OR designed the model architectures. BG, YMQ, SI, KH, and KF manually segmented the images. 
RM, DC, CB, DD, SAM, HM, GR, COH, AK, and JRL contributed clinical expertise. CM, JRL, TB, SAM, KS, and OR managed the project. COH, CK, ML, JRL, SN, SB, JDF, HM, GR, and OR wrote the paper.</p></p><h4 class="h4-border-top">Conflicts of Interest</h4><p><p class="abstract-paragraph">GR, HM, CK, COH, and DC were paid contractors of DeepMind and Google Health.</p></p> &#8206; <div id="app1" name="app1">Multimedia Appendix 1<p class="abstract-paragraph">Additional Tables S1-S8 and Figures S1-S7 show further visual examples of model outputs, performance metrics and detailed comparisons to previously published works.</p><a href="https://jmir.org/api/download?alt_name=jmir_v23i7e26151_app1.pdf&amp;filename=584abec9f29d69baaab930a03ecb2c2d.pdf" target="_blank">PDF File (Adobe PDF File), 10937 KB</a></div><hr/><div class="footnotes"><h4 class="h4-border-top" id="References">References</h4><ol><li><span id="ref1">Jemal A, Bray F, Center MM, Ferlay J, Ward E, Forman D. Global cancer statistics. CA Cancer J Clin 2011;61(2):69-90 [<a href="https://doi.org/10.3322/caac.20107" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.3322/caac.20107">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=21296855&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref2">Head and neck cancers incidence statistics. Cancer Research UK. &#160; URL: <a target="_blank" href="https://www.cancerresearchuk.org/health-professional/cancer-statistics/statistics-by-cancer-type/head-and-neck-cancers/incidence#heading-Two">https:/&#8203;/www.&#8203;cancerresearchuk.org/&#8203;health-professional/&#8203;cancer-statistics/&#8203;statistics-by-cancer-type/&#8203;head-and-neck-cancers/&#8203;incidence#heading-Two</a> [accessed 2018-02-08] </span></li><li><span id="ref3">NCIN data briefing: potentially HPV-related head and neck cancers. National Cancer Intelligence Network. 
&#160; URL: <a target="_blank" href="http://www.ncin.org.uk/publications/data_briefings/potentially_hpv_related_head_and_neck_cancers">http://www.ncin.org.uk/publications/data_briefings/potentially_hpv_related_head_and_neck_cancers</a> [accessed 2021-05-17] </span></li><li><span id="ref4">Profile of head and neck cancers in England: incidence, mortality and survival. Oxford Cancer Intelligence Unit. 2010. &#160; URL: <a target="_blank" href="http://www.ncin.org.uk/view?rid=69">http://www.ncin.org.uk/view?rid=69</a> [accessed 2021-05-17] </span></li><li><span id="ref5">Parkin DM, Boyd L, Walker LC. 16. The fraction of cancer attributable to lifestyle and environmental factors in the UK in 2010. Br J Cancer 2011 Dec 06;105 Suppl 2:77-81 [<a href="http://europepmc.org/abstract/MED/22158327" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1038/bjc.2011.489">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=22158327&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref6">Jensen K, Lambertsen K, Grau C. Late swallowing dysfunction and dysphagia after radiotherapy for pharynx cancer: frequency, intensity and correlation with dose and volume parameters. Radiother Oncol 2007 Oct;85(1):74-82. [<a target="_blank" href="https://dx.doi.org/10.1016/j.radonc.2007.06.004">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=17673322&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref7">Dirix P, Abbeel S, Vanstraelen B, Hermans R, Nuyts S. Dysphagia after chemoradiotherapy for head-and-neck squamous cell carcinoma: dose-effect relationships for the swallowing structures. 
Int J Radiat Oncol Biol Phys 2009 Oct 01;75(2):385-392 [<a href="https://doi.org/10.1016/j.ijrobp.2008.11.041" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.ijrobp.2008.11.041">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=19553033&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref8">Caudell JJ, Schaner PE, Desmond RA, Meredith RF, Spencer SA, Bonner JA. Dosimetric factors associated with long-term dysphagia after definitive radiotherapy for squamous cell carcinoma of the head and neck. Int J Radiat Oncol Biol Phys 2010 Feb 01;76(2):403-409. [<a target="_blank" href="https://dx.doi.org/10.1016/j.ijrobp.2009.02.017">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=19467801&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref9">Nutting CM, Morden JP, Harrington KJ, Urbano TG, Bhide SA, Clark C, et al. Parotid-sparing intensity modulated versus conventional radiotherapy in head and neck cancer (PARSPORT): a phase 3 multicentre randomised controlled trial. Lancet Oncol 2011 Feb;12(2):127-136. [<a target="_blank" href="https://dx.doi.org/10.1016/s1470-2045(10)70290-4">CrossRef</a>]</span></li><li><span id="ref10">Nelms BE, Tom&#233; WA, Robinson G, Wheeler J. Variations in the contouring of organs at risk: test case from a patient with oropharyngeal cancer. Int J Radiat Oncol Biol Phys 2012 Jan 01;82(1):368-378. [<a target="_blank" href="https://dx.doi.org/10.1016/j.ijrobp.2010.10.019">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=21123004&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref11">Voet PW, Dirkx ML, Teguh DN, Hoogeman MS, Levendag PC, Heijmen BJ. 
Does atlas-based autosegmentation of neck levels require subsequent manual contour editing to avoid risk of severe target underdosage? A dosimetric analysis. Radiother Oncol 2011 Mar;98(3):373-377. [<a target="_blank" href="https://dx.doi.org/10.1016/j.radonc.2010.11.017">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=21269714&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref12">Harari PM, Song S, Tom&#233; WA. Emphasizing conformal avoidance versus target definition for IMRT planning in head-and-neck cancer. Int J Radiat Oncol Biol Phys 2010 Jul 01;77(3):950-958 [<a href="http://europepmc.org/abstract/MED/20378266" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.ijrobp.2009.09.062">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=20378266&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref13">Chen Z, King W, Pearcey R, Kerba M, Mackillop WJ. The relationship between waiting time for radiotherapy and clinical outcomes: a systematic review of the literature. Radiother Oncol 2008 Apr;87(1):3-16. [<a target="_blank" href="https://dx.doi.org/10.1016/j.radonc.2007.11.016">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=18160158&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref14">Mikeljevic JS, Haward R, Johnston C, Crellin A, Dodwell D, Jones A, et al. Trends in postoperative radiotherapy delay and the effect on survival in breast cancer patients treated with conservation surgery. 
Br J Cancer 2004 Apr 05;90(7):1343-1348 [<a href="http://europepmc.org/abstract/MED/15054452" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1038/sj.bjc.6601693">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=15054452&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref15">The NHS Cancer Plan and the new NHS, Chapter 5. National Health Service. 2004. &#160; URL: <a target="_blank" href="http://www.wales.nhs.uk/technologymls/english/resources/pdf/cancer_nsf.pdf">http://www.wales.nhs.uk/technologymls/english/resources/pdf/cancer_nsf.pdf</a> [accessed 2021-05-17] </span></li><li><span id="ref16">Round C, Williams M, Mee T, Kirkby N, Cooper T, Hoskin P, et al. Radiotherapy demand and activity in England 2006-2020. Clin Oncol (R Coll Radiol) 2013 Sep;25(9):522-530. [<a target="_blank" href="https://dx.doi.org/10.1016/j.clon.2013.05.005">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=23768454&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref17">Rosenblatt E, Zubizarreta E. Radiotherapy in cancer care: facing the global challenge. International Atomic Energy Agency. 2017. &#160; URL: <a target="_blank" href="https://www-pub.iaea.org/MTCD/Publications/PDF/P1638_web.pdf">https://www-pub.iaea.org/MTCD/Publications/PDF/P1638_web.pdf</a> [accessed 2021-05-17] </span></li><li><span id="ref18">Veiga C, McClelland J, Moinuddin S, Louren&#231;o A, Ricketts K, Annkah J, et al. Toward adaptive radiotherapy for head and neck patients: feasibility study on using CT-to-CBCT deformable registration for "dose of the day" calculations. 
Med Phys 2014 Mar 19;41(3):031703 [<a href="https://doi.org/10.1118/1.4864240" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1118/1.4864240">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=24593707&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref19">Daisne J, Blumhofer A. Atlas-based automatic segmentation of head and neck organs at risk and nodal target volumes: a clinical validation. Radiat Oncol 2013 Jun 26;8(1):154. [<a target="_blank" href="https://dx.doi.org/10.1186/1748-717x-8-154">CrossRef</a>]</span></li><li><span id="ref20">Fortunati V, Verhaart RF, van der Lijn F, Niessen WJ, Veenland JF, Paulides MM, et al. Tissue segmentation of head and neck CT images for treatment planning: a multiatlas approach combined with intensity modeling. Med Phys 2013 Jul 20;40(7):071905. [<a target="_blank" href="https://dx.doi.org/10.1118/1.4810971">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=23822442&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref21">Hoang Duc AK, Eminowicz G, Mendes R, Wong S, McClelland J, Modat M, et al. Validation of clinical acceptability of an atlas-based segmentation algorithm for the delineation of organs at risk in head and neck cancer. Med Phys 2015 Sep 05;42(9):5027-5034 [<a href="https://doi.org/10.1118/1.4927567" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1118/1.4927567">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=26328953&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref22">Thomson D, Boylan C, Liptrot T, Aitkenhead A, Lee L, Yap B, et al. Evaluation of an automatic segmentation algorithm for definition of head and neck organs at risk. Radiat Oncol 2014;9(1):173. 
[<a target="_blank" href="https://dx.doi.org/10.1186/1748-717x-9-173">CrossRef</a>]</span></li><li><span id="ref23">Walker GV, Awan M, Tao R, Koay EJ, Boehling NS, Grant JD, et al. Prospective randomized double-blind study of atlas-based organ-at-risk autosegmentation-assisted radiation planning in head and neck cancer. Radiother Oncol 2014 Sep;112(3):321-325 [<a href="https://linkinghub.elsevier.com/retrieve/pii/S0167-8140(14)00358-2" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.radonc.2014.08.028">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=25216572&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref24">Gacha SJ, Le&#243;n SA. Segmentation of mandibles in computer tomography volumes of patients with foam cells carcinoma. In: Proceedings of the IX International Seminar of Biomedical Engineering (SIB). 2018 Presented at: IX International Seminar of Biomedical Engineering (SIB); May 16-18, 2018; Bogota, Colombia. [<a target="_blank" href="https://dx.doi.org/10.1109/SIB.2018.8467732">CrossRef</a>]</span></li><li><span id="ref25">Raudaschl PF, Zaffino P, Sharp GC, Spadea MF, Chen A, Dawant BM, et al. Evaluation of segmentation methods on head and neck CT: auto-segmentation challenge 2015. Med Phys 2017 May 21;44(5):2020-2036 [<a href="https://doi.org/10.1002/mp.12197" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1002/mp.12197">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=28273355&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref26">Wu X, Udupa JK, Tong Y, Odhner D, Pednekar GV, Simone CB, et al. AAR-RT - A system for auto-contouring organs at risk on CT images for radiation therapy planning: Principles, design, and large-scale evaluation on head-and-neck and thoracic cancer cases. 
Med Image Anal 2019 May;54:45-62 [<a href="http://europepmc.org/abstract/MED/30831357" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.media.2019.01.008">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30831357&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref27">Fritscher K, Raudaschl P, Zaffino P, Spadea MF, Sharp GC, Schubert R. Deep neural networks for fast segmentation of 3D medical images. In: Medical Image Computing and Computer-Assisted Intervention &#8211; MICCAI 2016. Switzerland: Springer; 2016:158-165.</span></li><li><span id="ref28">Ibragimov B, Xing L. Segmentation of organs-at-risks in head and neck CT images using convolutional neural networks. Med Phys 2017 Feb 16;44(2):547-557 [<a href="http://europepmc.org/abstract/MED/28205307" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1002/mp.12045">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=28205307&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref29">Mo&#269;nik D, Ibragimov B, Xing L, Strojan P, Likar B, Pernu&#353; F, et al. Segmentation of parotid glands from registered CT and MR images. Phys Med 2018 Aug;52:33-41 [<a href="http://europepmc.org/abstract/MED/30139607" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.ejmp.2018.06.012">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30139607&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref30">Ren X, Xiang L, Nie D, Shao Y, Zhang H, Shen D, et al. Interleaved 3D-CNNs for joint segmentation of small-volume structures in head and neck CT images. 
Med Phys 2018 May 23;45(5):2063-2075 [<a href="http://europepmc.org/abstract/MED/29480928" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1002/mp.12837">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=29480928&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref31">Zhong T, Huang X, Tang F, Liang S, Deng X, Zhang Y. Boosting-based cascaded convolutional neural networks for the segmentation of CT organs-at-risk in nasopharyngeal carcinoma. Med Phys 2019 Sep 16;46(12):5602-5611 [<a href="https://doi.org/10.1002/mp.13825" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1002/mp.13825">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31529501&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref32">Ronneberger O, Fischer P, Brox T. U-net: Convolutional networks for biomedical image segmentation. In: Medical Image Computing and Computer-Assisted Intervention &#8211; MICCAI 2015. Switzerland: Springer; 2015:234-241.</span></li><li><span id="ref33">De Fauw J, Ledsam JR, Romera-Paredes B, Nikolov S, Tomasev N, Blackwell S, et al. Clinically applicable deep learning for diagnosis and referral in retinal disease. Nat Med 2018 Sep 13;24(9):1342-1350. [<a target="_blank" href="https://dx.doi.org/10.1038/s41591-018-0107-6">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30104768&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref34">H&#228;nsch A, Schwier M, Gass T, Morgas T, Haas B, Klein J, et al. Comparison of different deep learning approaches for parotid gland segmentation from CT images. Proc. 
SPIE 10575, Medical Imaging 2018: Computer-Aided Diagnosis 2018:1057519 [<a href="https://doi.org/10.1117/12.2292962" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1117/12.2292962">CrossRef</a>]</span></li><li><span id="ref35">Zhu W, Huang Y, Tang H, Qian Z, Du N, Fan W, et al. AnatomyNet: Deep 3D Squeeze-and-excitation U-Nets for fast and fully automated whole-volume anatomical segmentation. bioRxiv 2018. [<a target="_blank" href="https://dx.doi.org/10.1101/392969">CrossRef</a>]</span></li><li><span id="ref36">Tong N, Gou S, Yang S, Ruan D, Sheng K. Fully automatic multi-organ segmentation for head and neck cancer radiotherapy using shape representation model constrained fully convolutional neural networks. Med Phys 2018 Oct 19;45(10):4558-4567 [<a href="http://europepmc.org/abstract/MED/30136285" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1002/mp.13147">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30136285&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref37">Liang S, Tang F, Huang X, Yang K, Zhong T, Hu R, et al. Deep-learning-based detection and segmentation of organs at risk in nasopharyngeal carcinoma computed tomographic images for radiotherapy planning. Eur Radiol 2019 Apr 9;29(4):1961-1967 [<a href="https://doi.org/10.1007/s00330-018-5748-9" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1007/s00330-018-5748-9">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30302589&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref38">Willems S, Crijns W, Saint-Esteven AL, Veen JV, Robben D, Depuydt T, et al. Clinical implementation of DeepVoxNet for auto-delineation of organs at risk in head and neck cancer patients in radiotherapy.
In: Endoscopy, Clinical Image-Based Procedures, and Skin Image Analysis. Switzerland: Springer; 2018:223-232.</span></li><li><span id="ref39">Kodym O, &#352;pan&#283;l M, Herout A. Segmentation of head and neck organs at risk using CNN with batch dice loss. arXiv.org: Computer Science - Computer Vision and Pattern Recognition. 2018. &#160; URL: <a target="_blank" href="https://arxiv.org/abs/1812.02427">https://arxiv.org/abs/1812.02427</a> [accessed 2021-05-17] </span></li><li><span id="ref40">Wang Y, Zhao L, Wang M, Song Z. Organ at risk segmentation in head and neck CT images using a two-stage segmentation framework based on 3D U-Net. IEEE Access 2019;7:144591-144602 [<a href="https://doi.org/10.1109/ACCESS.2019.2944958" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1109/access.2019.2944958">CrossRef</a>]</span></li><li><span id="ref41">Men K, Geng H, Cheng C, Zhong H, Huang M, Fan Y, et al. Technical Note: More accurate and efficient segmentation of organs-at-risk in radiotherapy with convolutional neural networks cascades. Med Phys 2019 Jan 07;46(1):286-292 [<a href="http://europepmc.org/abstract/MED/30450825" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1002/mp.13296">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30450825&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref42">Tappeiner E, Pr&#246;ll S, H&#246;nig M, Raudaschl PF, Zaffino P, Spadea MF, et al. Multi-organ segmentation of the head and neck area: an efficient hierarchical neural networks approach. 
Int J Comput Assist Radiol Surg 2019 May 7;14(5):745-754 [<a href="https://doi.org/10.1007/s11548-019-01922-4" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1007/s11548-019-01922-4">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30847761&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref43">Rhee DJ, Cardenas CE, Elhalawani H, McCarroll R, Zhang L, Yang J, et al. Automatic detection of contouring errors using convolutional neural networks. Med Phys 2019 Nov 26;46(11):5086-5097 [<a href="http://europepmc.org/abstract/MED/31505046" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1002/mp.13814">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31505046&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref44">Tang H, Chen X, Liu Y, Lu Z, You J, Yang M, et al. Clinically applicable deep learning framework for organs at risk delineation in CT images. Nat Mach Intell 2019 Sep 30;1(10):480-491 [<a href="https://doi.org/10.1038/s42256-019-0099-z" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1038/s42256-019-0099-z">CrossRef</a>]</span></li><li><span id="ref45">van Rooij W, Dahele M, Brandao HR, Delaney AR, Slotman BJ, Verbakel WF. Deep learning-based delineation of head and neck organs at risk: geometric and dosimetric evaluation. Int J Radiat Oncol Biol Phys 2019 Jul 01;104(3):677-684 [<a href="https://doi.org/10.1016/j.ijrobp.2019.02.040" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.ijrobp.2019.02.040">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30836167&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref46">Gou S, Tong N, Qi S, Yang S, Chin R, Sheng K. 
Self-channel-and-spatial-attention neural network for automated multi-organ segmentation on head and neck CT images. Phys Med Biol 2020 Dec 11;65(24):245034 [<a href="https://doi.org/10.1088/1361-6560/ab79c3" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1088/1361-6560/ab79c3">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=32097892&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref47">Mak RH, Endres MG, Paik JH, Sergeev RA, Aerts H, Williams CL, et al. Use of crowd innovation to develop an artificial intelligence-based solution for radiation therapy targeting. JAMA Oncol 2019 May 01;5(5):654-661 [<a href="http://europepmc.org/abstract/MED/30998808" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1001/jamaoncol.2019.0159">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30998808&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref48">Head-neck cetuximab. The Cancer Imaging Archive (TCIA). &#160; URL: <a target="_blank" href="https://wiki.cancerimagingarchive.net/display/Public/Head-Neck+Cetuximab">https://wiki.cancerimagingarchive.net/display/Public/Head-Neck+Cetuximab</a> [accessed 2021-05-17] </span></li><li><span id="ref49">Clark K, Vendt B, Smith K, Freymann J, Kirby J, Koppel P, et al. The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository. 
J Digit Imaging 2013 Dec 25;26(6):1045-1057 [<a href="http://europepmc.org/abstract/MED/23884657" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1007/s10278-013-9622-7">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=23884657&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref50">Zuley ML, Jarosz R, Kirk S, Colen R, Garcia K, Aredes ND. Radiology data from the cancer genome atlas head-neck squamous cell carcinoma [TCGA-HNSC] collection. The Cancer Imaging Archive (TCIA) 2020 [<a href="https://wiki.cancerimagingarchive.net/display/Public/TCGA-HNSC" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.7937/K9/TCIA.2016.LXKQ47MS">CrossRef</a>]</span></li><li><span id="ref51">Brouwer CL, Steenbakkers RJ, Bourhis J, Budach W, Grau C, Gr&#233;goire V, et al. CT-based delineation of organs at risk in the head and neck region: DAHANCA, EORTC, GORTEC, HKNPCSG, NCIC CTG, NCRI, NRG Oncology and TROG consensus guidelines. Radiother Oncol 2015 Oct;117(1):83-90. [<a target="_blank" href="https://dx.doi.org/10.1016/j.radonc.2015.07.041">CrossRef</a>]</span></li><li><span id="ref52">Felzenszwalb PF, Huttenlocher DP. Distance transforms of sampled functions. Theory Comput 2012;8(1):415-428. [<a target="_blank" href="https://dx.doi.org/10.4086/toc.2012.v008a019">CrossRef</a>]</span></li><li><span id="ref53">Kingma DP, Ba J. Adam: a method for stochastic optimization. arXiv.org: Computer Science - Machine Learning. 2014. &#160; URL: <a target="_blank" href="http://arxiv.org/abs/1412.6980">http://arxiv.org/abs/1412.6980</a> [accessed 2021-05-17] </span></li><li><span id="ref54">Wu Z, Shen C, van den Hengel A. Bridging category-level and instance-level semantic image segmentation. arXiv.org: Computer Science - Computer Vision and Pattern Recognition. 2016. 
&#160; URL: <a target="_blank" href="http://arxiv.org/abs/1605.06885v1">http://arxiv.org/abs/1605.06885v1</a> [accessed 2021-05-17] </span></li><li><span id="ref55">Lorensen WE, Cline HE. Marching cubes: a high resolution 3D surface construction algorithm. SIGGRAPH Comput Graph 1987 Aug;21(4):163-169. [<a target="_blank" href="https://dx.doi.org/10.1145/37402.37422">CrossRef</a>]</span></li><li><span id="ref56">Wang Z, Wei L, Wang L, Gao Y, Chen W, Shen D. Hierarchical vertex regression-based segmentation of head and neck CT images for radiotherapy planning. IEEE Trans Image Process 2018 Feb;27(2):923-937. [<a target="_blank" href="https://dx.doi.org/10.1109/tip.2017.2768621">CrossRef</a>]</span></li><li><span id="ref57">Torosdagli N, Liberton DK, Verma P, Sincan M, Lee J, Pattanaik S, et al. Robust and fully automated segmentation of mandible from CT scans. In: Proceedings of the IEEE 14th International Symposium on Biomedical Imaging (ISBI 2017). 2017 Presented at: IEEE 14th International Symposium on Biomedical Imaging (ISBI 2017); April 18-21, 2017; Melbourne, VIC, Australia. [<a target="_blank" href="https://dx.doi.org/10.1109/isbi.2017.7950734">CrossRef</a>]</span></li><li><span id="ref58">Liang S, Thung K, Nie D, Zhang Y, Shen D. Multi-view spatial aggregation framework for joint localization and segmentation of organs at risk in head and neck CT images. IEEE Trans Med Imaging 2020 Sep;39(9):2794-2805 [<a href="https://doi.org/10.1109/TMI.2020.2975853" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1109/tmi.2020.2975853">CrossRef</a>]</span></li><li><span id="ref59">Qiu B, Guo J, Kraeima J, Glas HH, Borra RJ, Witjes MJ, et al. Recurrent convolutional neural networks for mandible segmentation from computed tomography. arXiv.org: Electrical Engineering and Systems Science - Image and Video Processing. 2020. 
&#160; URL: <a target="_blank" href="https://arxiv.org/abs/2003.06486">https://arxiv.org/abs/2003.06486</a> [accessed 2021-05-27] </span></li><li><span id="ref60">Sun S, Liu Y, Bai N, Tang H, Chen X, Huang Q, et al. AttentionAnatomy: a unified framework for whole-body organs at risk segmentation using multiple partially annotated datasets. In: Proceedings of the IEEE 17th International Symposium on Biomedical Imaging (ISBI). 2020 Presented at: IEEE 17th International Symposium on Biomedical Imaging (ISBI); April 3-7, 2020; Iowa City, IA, USA. [<a target="_blank" href="https://dx.doi.org/10.1109/isbi45749.2020.9098588">CrossRef</a>]</span></li><li><span id="ref61">van Dijk LV, Van den Bosch L, Aljabar P, Peressutti D, Both S, Steenbakkers RJ, et al. Improving automatic delineation for head and neck organs at risk by Deep Learning Contouring. Radiother Oncol 2020 Jan;142:115-123 [<a href="https://linkinghub.elsevier.com/retrieve/pii/S0167-8140(19)33111-1" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.radonc.2019.09.022">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31653573&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref62">Wong J, Fong A, McVicar N, Smith S, Giambattista J, Wells D, et al. Comparing deep learning-based auto-segmentation of organs at risk and clinical target volumes to expert inter-observer variability in radiotherapy planning. Radiother Oncol 2020 Mar;144:152-158 [<a href="https://doi.org/10.1016/j.radonc.2019.10.019" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.radonc.2019.10.019">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31812930&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref63">Chan JW, Kearney V, Haaf S, Wu S, Bogdanov M, Reddick M, et al. 
A convolutional neural network algorithm for automatic segmentation of head and neck organs at risk using deep lifelong learning. Med Phys 2019 May 04;46(5):2204-2213 [<a href="https://doi.org/10.1002/mp.13495" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1002/mp.13495">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30887523&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref64">Gao Y, Huang R, Chen M, Wang Z, Deng J, Chen Y, et al. FocusNet: imbalanced large and small organ segmentation with an end-to-end deep neural network for head and neck CT images. In: Medical Image Computing and Computer Assisted Intervention &#8211; MICCAI 2019. Switzerland: Springer; 2019:829-838.</span></li><li><span id="ref65">Jiang J, Sharif E, Um H, Berry S, Veeraraghavan H. Local block-wise self attention for normal organ segmentation. arXiv.org: Computer Science - Computer Vision and Pattern Recognition. 2019. &#160; URL: <a target="_blank" href="https://arxiv.org/abs/1909.05054">https://arxiv.org/abs/1909.05054</a> [accessed 2021-05-16] </span></li><li><span id="ref66">Lei W, Wang H, Gu R, Zhang S, Wang G. DeepIGeoS-V2: Deep interactive segmentation of multiple organs from head and neck images with lightweight CNNs. In: Large-Scale Annotation of Biomedical Data and Expert Label Synthesis and Hardware Aware Learning for Medical Imaging and Computer Assisted Intervention. Switzerland: Springer; 2019:61-69.</span></li><li><span id="ref67">Sun Y, Shi H, Zhang S, Wang P, Zhao W, Zhou X, et al. Accurate and rapid CT image segmentation of the eyes and surrounding organs for precise radiotherapy. 
Med Phys 2019 May 22;46(5):2214-2222 [<a href="https://doi.org/10.1002/mp.13463" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1002/mp.13463">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30815885&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref68">Tong N, Gou S, Yang S, Cao M, Sheng K. Shape constrained fully convolutional DenseNet with adversarial training for multiorgan segmentation on head and neck CT and low-field MR images. Med Phys 2019 Jun 06;46(6):2669-2682 [<a href="http://europepmc.org/abstract/MED/31002188" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1002/mp.13553">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31002188&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref69">Xue Y, Tang H, Qiao Z, Gong G, Yin Y, Qian Z, et al. Shape-aware organ segmentation by predicting signed distance maps. Proc AAAI Conf Artif Intell 2020 Apr 03;34(07):12565-12572 [<a href="https://arxiv.org/abs/1912.03849" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1609/aaai.v34i07.6946">CrossRef</a>]</span></li><li><span id="ref70">Nikolov S, Blackwell S, Mendes R, Fauw JD, Meyer C, Hughes C, DeepMind Radiographer Consortium, et al. Deep learning to achieve clinically applicable segmentation of head and neck anatomy for radiotherapy. arXiv.org: Computer Science - Computer Vision and Pattern Recognition. 2018. &#160; URL: <a target="_blank" href="https://arxiv.org/abs/1809.04430v1">https://arxiv.org/abs/1809.04430v1</a> [accessed 2021-05-16] </span></li><li><span id="ref71">Fritscher KD, Peroni M, Zaffino P, Spadea MF, Schubert R, Sharp G. 
Automatic segmentation of head and neck CT images for radiotherapy treatment planning using multiple atlases, statistical appearance models, and geodesic active contours. Med Phys 2014 May 24;41(5):051910 [<a href="http://europepmc.org/abstract/MED/24784389" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1118/1.4871623">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=24784389&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref72">Qazi AA, Pekar V, Kim J, Xie J, Breen SL, Jaffray DA. Auto-segmentation of normal and target structures in head and neck CT images: a feature-driven model-based approach. Med Phys 2011 Nov 26;38(11):6160-6170 [<a href="https://doi.org/10.1118/1.3654160" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1118/1.3654160">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=22047381&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref73">Tam CM, Yang X, Tian S, Jiang X, Beitler JJ, Li S. Automated delineation of organs-at-risk in head and neck CT images using multi-output support vector regression. In: Proceedings of the SPIE 10578, Medical Imaging 2018: Biomedical Applications in Molecular, Structural, and Functional Imaging. 2018 Presented at: SPIE 10578, Medical Imaging 2018: Biomedical Applications in Molecular, Structural, and Functional Imaging; March 12, 2018; Houston, Texas, United States. [<a target="_blank" href="https://dx.doi.org/10.1117/12.2292556">CrossRef</a>]</span></li><li><span id="ref74">Wang Z, Liu X, Chen W. Head and neck CT atlases alignment based on anatomical priors constraint. 
J Med Imaging Health Infor 2019 Dec 01;9(9):2004-2011 [<a href="https://doi.org/10.1166/jmihi.2019.2844" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1166/jmihi.2019.2844">CrossRef</a>]</span></li><li><span id="ref75">Ayyalusamy A, Vellaiyan S, Subramanian S, Ilamurugu A, Satpathy S, Nauman M, et al. Auto-segmentation of head and neck organs at risk in radiotherapy and its dependence on anatomic similarity. Radiat Oncol J 2019 Jun;37(2):134-142 [<a href="https://dx.doi.org/10.3857/roj.2019.00038" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.3857/roj.2019.00038">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31266293&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref76">Haq R, Berry SL, Deasy JO, Hunt M, Veeraraghavan H. Dynamic multiatlas selection-based consensus segmentation of head and neck structures from CT images. Med Phys 2019 Dec 31;46(12):5612-5622 [<a href="http://europepmc.org/abstract/MED/31587300" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1002/mp.13854">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31587300&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref77">McCarroll RE, Beadle BM, Balter PA, Burger H, Cardenas CE, Dalvie S, et al. Retrospective validation and clinical implementation of automated contouring of organs at risk in the head and neck: a step toward automated radiation treatment planning for low- and middle-income countries. J Glob Oncol 2018 Dec(4):1-11 [<a href="https://doi.org/10.1200/JGO.18.00055" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1200/jgo.18.00055">CrossRef</a>]</span></li><li><span id="ref78">Liu Q, Qin A, Liang J, Yan D. 
Evaluation of atlas-based auto-segmentation and deformable propagation of organs-at-risk for head-and-neck adaptive radiotherapy. Recent Pat Top Imaging 2016 May 24;5(2):79-87 [<a href="https://www.researchgate.net/profile/An_Qin2/publication/304143072_Evaluation_of_Atlas-Based_Auto-Segmentation_and_Deformable_Propagation_of_Organs-at-Risk_for_Head-and-Neck_Adaptive_Radiotherapy/links/5bd8b8fda6fdcc3a8db1722c/Evaluation-of-Atlas-Based-Auto-Segmentation-and-Deformable-Propagation-of-Organs-at-Risk-for-Head-and-Neck-Adaptive-Radiotherapy.pdf" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.2174/2451827105999160415123925">CrossRef</a>]</span></li><li><span id="ref79">Tao C, Yi J, Chen N, Ren W, Cheng J, Tung S, et al. Multi-subject atlas-based auto-segmentation reduces interobserver variation and improves dosimetric parameter consistency for organs at risk in nasopharyngeal carcinoma: a multi-institution clinical study. Radiother Oncol 2015 Jun;115(3):407-411 [<a href="https://doi.org/10.1016/j.radonc.2015.05.012" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.radonc.2015.05.012">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=26025546&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref80">Wachinger C, Fritscher K, Sharp G, Golland P. Contour-driven atlas-based segmentation. IEEE Trans Med Imaging 2015 Dec;34(12):2492-2505 [<a href="https://doi.org/10.1109/TMI.2015.2442753" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1109/tmi.2015.2442753">CrossRef</a>]</span></li><li><span id="ref81">Zhu M, Bzdusek K, Brink C, Eriksen JG, Hansen O, Jensen HA, et al. 
Multi-institutional quantitative evaluation and clinical validation of Smart Probabilistic Image Contouring Engine (SPICE) autosegmentation of target structures and normal tissues on computer tomography images in the head and neck, thorax, liver, and male pelvis areas. Int J Radiat Oncol Biol Phys 2013 Nov 15;87(4):809-816 [<a href="https://doi.org/10.1016/j.ijrobp.2013.08.007" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.ijrobp.2013.08.007">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=24138920&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref82">Teguh DN, Levendag PC, Voet PW, Al-Mamgani A, Han X, Wolf TK, et al. Clinical validation of atlas-based auto-segmentation of multiple target volumes and normal tissue (swallowing/mastication) structures in the head and neck. Int J Radiat Oncol Biol Phys 2011 Nov 15;81(4):950-957. [<a target="_blank" href="https://dx.doi.org/10.1016/j.ijrobp.2010.07.009">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=20932664&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref83">Han X, Hibbard LS, O'Connell NP, Willcut V. Automatic segmentation of parotids in head and neck CT images using multi-atlas fusion. ResearchGate. 2011. 
&#160; URL: <a target="_blank" href="https://www.researchgate.net/profile/Lyndon-Hibbard/publication/228519091_Automatic_Segmentation_of_Parotids_in_Head_and_Neck_CT_Images_using_Multi-atlas_Fusion/links/0deec516d54dfccb97000000/Automatic-Segmentation-of-Parotids-in-Head-and-Neck-CT-Images-using-Multi-atlas-Fusion.pdf">https:/&#8203;/www.&#8203;researchgate.net/&#8203;profile/&#8203;Lyndon-Hibbard/&#8203;publication/&#8203;228519091_Automatic_Segmentation_of_Parotids_in_Head_and_Neck_CT_Images_using_Multi-atlas_Fusion/&#8203;links/&#8203;0deec516d54dfccb97000000/&#8203;Automatic-Segmentation-of-Parotids-in-Head-and-Neck-CT-Images-using-Multi-atlas-Fusion.&#8203;pdf</a> [accessed 2021-05-27] </span></li><li><span id="ref84">Sims R, Isambert A, Gr&#233;goire V, Bidault F, Fresco L, Sage J, et al. A pre-clinical assessment of an atlas-based automatic segmentation tool for the head and neck. Radiother Oncol 2009 Dec;93(3):474-478. [<a target="_blank" href="https://dx.doi.org/10.1016/j.radonc.2009.08.013">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=19758720&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref85">Han X, Hoogeman MS, Levendag PC, Hibbard LS, Teguh DN, Voet P, et al. Atlas-based auto-segmentation of head and neck CT images. In: Medical Image Computing and Computer-Assisted Intervention &#8211; MICCAI 2008. Berlin, Heidelberg: Springer; 2008:434-441.</span></li><li><span id="ref86">Hoogeman M, Han X, Teguh D, Voet P, Nowak P, Wolf T, et al. Atlas-based auto-segmentation of CT images in head and neck cancer: what is the best approach? Int J Radiat Oncol Biol Phys 2008 Sep;72(1):591 [<a href="https://doi.org/10.1016/j.ijrobp.2008.06.196" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.ijrobp.2008.06.196">CrossRef</a>]</span></li><li><span id="ref87">Huang C, Badiei M, Seo H, Ma M, Liang X, Capaldi D, et al. 
Atlas based segmentations via semi-supervised diffeomorphic registrations. arXiv.org: Computer Science - Computer Vision and Pattern Recognition. 2019. &#160; URL: <a target="_blank" href="https://arxiv.org/abs/1911.10417">https://arxiv.org/abs/1911.10417</a> [accessed 2021-05-16] </span></li><li><span id="ref88">Hardcastle N, Tom&#233; WA, Cannon DM, Brouwer CL, Wittendorp PW, Dogan N, et al. A multi-institution evaluation of deformable image registration algorithms for automatic organ delineation in adaptive head and neck radiotherapy. Radiat Oncol 2012 Jun 15;7(1):90 [<a href="https://link.springer.com/article/10.1186/1748-717X-7-90" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1186/1748-717x-7-90">CrossRef</a>]</span></li><li><span id="ref89">La Macchia M, Fellin F, Amichetti M, Cianchetti M, Gianolini S, Paola V, et al. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer. Radiat Oncol 2012;7(1):160 [<a href="https://link.springer.com/article/10.1186/1748-717X-7-160" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1186/1748-717x-7-160">CrossRef</a>]</span></li><li><span id="ref90">Zhang T, Chi Y, Meldolesi E, Yan D. Automatic delineation of on-line head-and-neck computed tomography images: toward on-line adaptive radiotherapy. Int J Radiat Oncol Biol Phys 2007 Jun 01;68(2):522-530 [<a href="https://doi.org/10.1016/j.ijrobp.2007.01.038" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.ijrobp.2007.01.038">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=17418960&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref91">Dice LR. Measures of the amount of ecologic association between species. Ecol Soc Am 1945;26(3):297-302. 
[<a target="_blank" href="https://dx.doi.org/10.2307/1932409">CrossRef</a>]</span></li><li><span id="ref92">Hong T, Tome W, Chappel R, Harari P. Variations in target delineation for head and neck IMRT: an international multi-institutional study. Int J Radiat Oncol Biol Phys 2004 Sep;60:157-158 [<a href="http://www.sciencedirect.com/science/article/pii/S0360301604011307" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/s0360-3016(04)01130-7">CrossRef</a>]</span></li><li><span id="ref93">Wuthrick EJ, Zhang Q, Machtay M, Rosenthal DI, Nguyen-Tan PF, Fortin A, et al. Institutional clinical trial accrual volume and survival of patients with head and neck cancer. J Clin Oncol 2015 Jan 10;33(2):156-164 [<a href="https://doi.org/10.1200/JCO.2014.56.5218" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1200/jco.2014.56.5218">CrossRef</a>]</span></li><li><span id="ref94">Vaassen F, Hazelaar C, Vaniqui A, Gooding M, van der Heyden B, Canters R, et al. Evaluation of measures for assessing time-saving of automatic organ-at-risk segmentation in radiotherapy. Phys Imaging Radiat Oncol 2020 Jan;13:1-6 [<a href="https://linkinghub.elsevier.com/retrieve/pii/S2405-6316(19)30063-6" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.phro.2019.12.001">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=33458300&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref95">Kiser KJ, Barman A, Stieb S, Fuller CD, Giancardo L. Novel autosegmentation spatial similarity metrics capture the time required to correct segmentations better than traditional metrics in a thoracic cavity segmentation workflow. medRxiv 2020. 
[<a target="_blank" href="https://dx.doi.org/10.1101/2020.05.14.20102103">CrossRef</a>]</span></li><li><span id="ref96">Sharp G, Fritscher KD, Pekar V, Peroni M, Shusharina N, Veeraraghavan H, et al. Vision 20/20: perspectives on automated image segmentation for radiotherapy. Med Phys 2014 May 24;41(5):050902 [<a href="http://europepmc.org/abstract/MED/24784366" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1118/1.4871620">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=24784366&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref97">Kosmin M, Ledsam J, Romera-Paredes B, Mendes R, Moinuddin S, de Souza D, et al. Rapid advances in auto-segmentation of organs at risk and target volumes in head and neck cancer. Radiother Oncol 2019 Jun;135:130-140 [<a href="https://doi.org/10.1016/j.radonc.2019.03.004" target="_blank">FREE Full text</a>] [<a target="_blank" href="https://dx.doi.org/10.1016/j.radonc.2019.03.004">CrossRef</a>] [<a href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31015159&amp;dopt=Abstract" target="_blank">Medline</a>]</span></li><li><span id="ref98">Guo D, Jin D, Zhu Z, Ho TY, Harrison HP, Chao CH, et al. Organ at risk segmentation for head and neck cancer using stratified learning and neural architecture search. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). 2020 Presented at: IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR); June 13-19, 2020; Seattle, WA, USA. [<a target="_blank" href="https://dx.doi.org/10.1109/cvpr42600.2020.00428">CrossRef</a>]</span></li><li><span id="ref99">Surface distance. DeepMind. &#160; URL: <a target="_blank" href="https://github.com/deepmind/surface-distance">https://github.com/deepmind/surface-distance</a> [accessed 2021-05-27] </span></li><li><span id="ref100">TCIA CT scan dataset. 
DeepMind. &#160; URL: <a target="_blank" href="https://github.com/deepmind/tcia-ct-scan-dataset">https://github.com/deepmind/tcia-ct-scan-dataset</a> [accessed 2021-05-27] </span></li></ol></div><br/><hr/><a name="Abbreviations">&#8206;</a><h4 class="navigation-heading" id="Abbreviations" data-label="Abbreviations">Abbreviations</h4><table width="80%" border="0" align="center"><tr><td><b>CT:</b> computed tomography</td></tr><tr><td><b>DSC:</b> Dice similarity coefficient</td></tr><tr><td><b>NHS:</b> National Health Service</td></tr><tr><td><b>PDDCA:</b> public domain database for computational anatomy</td></tr><tr><td><b>TCGA-HNSC:</b> The Cancer Genome Atlas Head-Neck Squamous Cell Carcinoma</td></tr><tr><td><b>TCIA:</b> The Cancer Imaging Archive</td></tr><tr><td><b>UCLH:</b> University College London Hospitals</td></tr></table><br/><hr/><p style="font-style: italic">Edited by R Kukafka; submitted 30.11.20; peer-reviewed by JA Ben&#237;tez-Andrades, R Vilela; comments to author 11.01.21; revised version received 10.02.21; accepted 30.04.21; published 12.07.21</p><a href="https://support.jmir.org/hc/en-us/articles/115002955531" id="Copyright" target="_blank" class="navigation-heading h4 d-block" aria-label="Copyright - what is a Creative Commons License?" data-label="Copyright">Copyright <span class="fas fa-question-circle"/></a><p class="article-copyright">&#169;Stanislav Nikolov, Sam Blackwell, Alexei Zverovitch, Ruheena Mendes, Michelle Livne, Jeffrey De Fauw, Yojan Patel, Clemens Meyer, Harry Askham, Bernadino Romera-Paredes, Christopher Kelly, Alan Karthikesalingam, Carlton Chu, Dawn Carnell, Cheng Boon, Derek D'Souza, Syed Ali Moinuddin, Bethany Garie, Yasmin McQuinlan, Sarah Ireland, Kiarna Hampton, Krystle Fuller, Hugh Montgomery, Geraint Rees, Mustafa Suleyman, Trevor Back, C&#237;an Owen Hughes, Joseph R Ledsam, Olaf Ronneberger. 
Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 12.07.2021.</p><small class="article-license"><p class="abstract-paragraph">This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.</p></small><br/></section></article></section></section></main></div></div></div></div> <aside data-test="sidebar-exists" class="sidebar-citation col-lg-3 mb-5"><!----> <div><h2 class="h4 green-heading-underline width-fit-content"> Citation </h2> <p class="fw-bold"> Please cite as: </p> <p><span> Nikolov S<span>,</span></span><span> Blackwell S<span>,</span></span><span> Zverovitch A<span>,</span></span><span> Mendes R<span>,</span></span><span> Livne M<span>,</span></span><span> De Fauw J<span>,</span></span><span> Patel Y<span>,</span></span><span> Meyer C<span>,</span></span><span> Askham H<span>,</span></span><span> Romera-Paredes B<span>,</span></span><span> Kelly C<span>,</span></span><span> Karthikesalingam A<span>,</span></span><span> Chu C<span>,</span></span><span> Carnell D<span>,</span></span><span> Boon C<span>,</span></span><span> D'Souza D<span>,</span></span><span> Moinuddin SA<span>,</span></span><span> Garie B<span>,</span></span><span> McQuinlan Y<span>,</span></span><span> Ireland S<span>,</span></span><span> Hampton K<span>,</span></span><span> Fuller K<span>,</span></span><span> Montgomery H<span>,</span></span><span> Rees G<span>,</span></span><span> Suleyman M<span>,</span></span><span> Back T<span>,</span></span><span> Hughes CO<span>,</span></span><span> Ledsam 
JR<span>,</span></span><span> Ronneberger O<!----></span> <br> <span>Clinically Applicable Segmentation of Head and Neck Anatomy for Radiotherapy: Deep Learning Algorithm Development and Validation Study</span> <br> <span>J Med Internet Res 2021;23(7):e26151</span> <br> <span>doi: <span><a aria-label="DOI number 10.2196/26151" data-test="article-doi" target="_blank" href="https://doi.org/10.2196/26151"> 10.2196/26151 </a></span></span> <span style="display: block"> PMID: <span><a data-test="article-pmid" aria-label="PMID 34255661" target="_blank" href="https://www.ncbi.nlm.nih.gov/pubmed/34255661">34255661</a></span></span> <span style="display: block"> PMCID: <span><a data-test="article-pmcid" aria-label="PMCID 8314151" target="_blank" href="https://www.ncbi.nlm.nih.gov/pmc/articles/8314151">8314151</a></span></span></p> <button title="Copy Citation" data-test="copy-to-clipboard-button" class="btn btn-small btn-grey"><span aria-hidden="true" class="icon fas fa-paste"></span> Copy Citation to Clipboard </button> <!----></div> <div class="export-metadata"><h2 class="h4 green-heading-underline width-fit-content"> Export Metadata </h2> <div><a aria-label="Export metadata in END" target="_blank" data-test="test-end-link" href="https://www.jmir.org/article/export/end/jmir_v23i7e26151" rel="noreferrer"> END </a><span> for: Endnote</span></div> <div><a aria-label="Export metadata in BibTeX" target="_blank" data-test="test-bib-link" href="https://www.jmir.org/article/export/bib/jmir_v23i7e26151" rel="noreferrer"> BibTeX </a><span> for: BibDesk, LaTeX</span></div> <div><a aria-label="Export metadata in RIS" target="_blank" data-test="test-ris-link" href="https://www.jmir.org/article/export/ris/jmir_v23i7e26151" rel="noreferrer"> RIS </a><span> for: RefMan, Procite, Endnote, RefWorks</span></div> <div><a target="_blank" data-test="doi-link" href="http://www.mendeley.com/import/?doi=10.2196/26151"> Add this article to your Mendeley library </a></div></div> <div 
class="collection desktop-show"><h2 tabindex="0" data-test="article-collection" class="h4 green-heading-underline width-fit-content"> This paper is in the following <span class="collection__span">e-collection/theme issue:</span></h2> <a href="/themes/797" data-test="article-collection" aria-label="1389 articles belongs to Artificial Intelligence e-collection/theme issue" class="collection__link"> Artificial Intelligence (1389) </a><a href="/themes/500" data-test="article-collection" aria-label="1478 articles belongs to Machine Learning e-collection/theme issue" class="collection__link"> Machine Learning (1478) </a><a href="/themes/58" data-test="article-collection" aria-label="1033 articles belongs to Clinical Informatics e-collection/theme issue" class="collection__link"> Clinical Informatics (1033) </a><a href="/themes/412" data-test="article-collection" aria-label="195 articles belongs to Imaging Informatics e-collection/theme issue" class="collection__link"> Imaging Informatics (195) </a><a href="/themes/186" data-test="article-collection" aria-label="1194 articles belongs to Decision Support for Health Professionals e-collection/theme issue" class="collection__link"> Decision Support for Health Professionals (1194) </a><a href="/themes/67" data-test="article-collection" aria-label="1415 articles belongs to Clinical Information and Decision Making e-collection/theme issue" class="collection__link"> Clinical Information and Decision Making (1415) </a><a href="/themes/297" data-test="article-collection" aria-label="425 articles belongs to Innovations and Technology in Cancer Care e-collection/theme issue" class="collection__link"> Innovations and Technology in Cancer Care (425) </a></div> <div><h2 class="h4 green-heading-underline width-fit-content"> Download </h2> <div class="download-btns"><a target="_blank" href="https://www.jmir.org/2021/7/e26151/PDF" aria-label="Download PDF" data-test="pdf-button" class="btn btn-small btn-grey mt-1"><span aria-hidden="true" 
class="icon fas fa-download"></span> Download PDF</a> <a target="_blank" href="https://www.jmir.org/2021/7/e26151/XML" aria-label="Download XML" data-test="xml-button" class="btn btn-small btn-grey mt-1"><span aria-hidden="true" class="icon fas fa-download"></span> Download XML</a></div></div></aside></div></div></div></div></div> <div><section data-test="footer" class="footer"><footer id="footer"><div class="container-fluid footer-journal-name"><div class="col-12"><h2 data-test="journal-name"> Journal of Medical Internet Research <span> ISSN 1438-8871 </span></h2></div></div> <div class="container"><div class="row"><div class="col-lg-3 col-6"><h3 tabindex="0" class="footer-title"> Resource Centre </h3> <ul><li data-test="resource-links"><a href="/resource-centre/author-hub"> Author Hub </a></li><li data-test="resource-links"><a href="/resource-centre/editor-hub"> Editor Hub </a></li><li data-test="resource-links"><a href="/resource-centre/reviewer-hub"> Reviewer Hub </a></li><li data-test="resource-links"><a href="/resource-centre/librarian-hub"> Librarian Hub </a></li></ul></div> <div class="col-lg-3 col-6"><h3 tabindex="0" class="footer-title"> Browse Journal </h3> <ul><li data-test="journal-links"><a href="/announcements"> Latest Announcements </a></li><li data-test="journal-links"><a href="/search/authors"> Authors </a></li> <li data-test="journal-links"><a href="/themes"> Themes </a></li><li data-test="journal-links"><a href="/issues"> Issues </a></li> <li data-test="journal-links"><a href="https://blog.jmir.org "> Blog </a></li></ul></div> <div class="col-lg-3 col-6"><h3 tabindex="0" 
class="footer-title"> About </h3> <ul><li data-test="about-links"><a href="/privacy-statement"> Privacy Statement </a></li><li data-test="about-links"><a href="/contact-us"> Contact Us </a></li> <li><a href="/sitemap.xml" target="_blank"> Sitemap </a></li></ul></div> <div class="col-lg-3 col-6"><h3 tabindex="0" class="footer-title"> Connect With Us </h3> <span class="sm-icons"><a aria-label="JMIR Publications Twitter account" title="Twitter" href="https://twitter.com/jmirpub" target="_blank" rel="noreferrer" data-test="social-links" class="twitter mr-1"></a><a aria-label="JMIR Publications Facebook account" title="Facebook" href="https://www.facebook.com/JMedInternetRes" target="_blank" rel="noreferrer" data-test="social-links" class="facebook mr-1"></a><a aria-label="JMIR Publications Linkedin account" title="Linkedin" href="https://www.linkedin.com/company/jmir-publications" target="_blank" rel="noreferrer" data-test="social-links" class="linkedin mr-1"></a><a aria-label="JMIR Publications YouTube account" title="YouTube" href="https://www.youtube.com/c/JMIRPublications" target="_blank" rel="noreferrer" data-test="social-links" class="youtube mr-1"></a><a aria-label="JMIR Publications Instagram account" title="Instagram" href="https://www.instagram.com/jmirpub" target="_blank" rel="noreferrer" data-test="social-links" class="instagram"></a> <a target="_blank" rel="noreferrer" aria-label="RSS Subscription" title="RSS Subscription" href="https://www.jmir.org/feed/atom" class="rss"></a></span></div> <div class="email-subscribtion-button col-lg-3 col-md-6 col-sm-6 col-12"><h3 tabindex="0" class="footer-title"> Get Table of Contents Alerts </h3> <a target="_blank" rel="noopener noreferrer" aria-label="Newsletter Subscription" title="Newsletter Subscription" href="https://landingpage.jmirpublications.com/journal-preference-selection"><span>Get Alerts</span> <span class="icon fas fa-paper-plane"></span></a></div> <div class="col-12 text-center mt-5"><p 
class="footer-copyright"> Copyright © <time datetime="2024"> 2024 </time> JMIR Publications </p></div></div></div></footer></section></div></div> <!----> <div><a tabindex="0" href="javascript:;" title="Scroll to the top of the page" role="button" class="scroll-to-very-top"><span aria-hidden="true" class="icon fas fa-chevron-up"></span></a></div> <!----><!----><!----><!----><!----><!----><!----><!----><!----><!----><!----><!----><!----></div></div></div><script>window.__NUXT__=(function(a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p,q,r,s,t,u,v,w,x,y,z,A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z,_,$,aa,ab,ac,ad,ae,af,ag,ah,ai,aj,ak,al,am,an,ao,ap,aq,ar,as,at,au,av,aw,ax,ay,az,aA,aB,aC,aD,aE,aF,aG,aH,aI,aJ,aK,aL,aM,aN,aO,aP,aQ,aR,aS,aT,aU,aV,aW,aX,aY,aZ,a_,a$,ba,bb,bc,bd,be,bf,bg,bh,bi,bj,bk,bl,bm,bn,bo,bp,bq,br,bs){return {layout:"front",data:[{tabs:{html:{name:"Article",route:c,path:"articleHtml"},authors:{name:"Authors",route:"authors",path:"articleAuthors"},citations:{name:"Cited by (147)",route:"citations",path:"articleCitations"},tweetations:{name:"Tweetations (12)",route:"tweetations",path:"articleTweetations"},metrics:{name:"Metrics",route:"metrics",path:"articleMetrics"}},registeredReport:G,jobs:[{title:"Psychiatrist",employer:o,city:"Tehachapi",state_province:p,url:"https:\u002F\u002Fcareers.jmir.org\u002Fjobs\u002Fjson\u002F20807699\u002Fpsychiatrist"},{title:"Psychologist",employer:o,city:ae,state_province:p,url:"https:\u002F\u002Fcareers.jmir.org\u002Fjobs\u002Fjson\u002F20807658\u002Fpsychologist"},{title:H,employer:o,city:"San 
Diego",state_province:p,url:"https:\u002F\u002Fcareers.jmir.org\u002Fjobs\u002Fjson\u002F20807706\u002Fclinical-social-worker"},{title:H,employer:o,city:ae,state_province:p,url:"https:\u002F\u002Fcareers.jmir.org\u002Fjobs\u002Fjson\u002F20807713\u002Fclinical-social-worker"},{title:H,employer:o,city:"Jamestown",state_province:p,url:"https:\u002F\u002Fcareers.jmir.org\u002Fjobs\u002Fjson\u002F20807704\u002Fclinical-social-worker"}]},{html:"\u003Cmain id=\"wrapper\" class=\"wrapper ArticleMain clearfix\"\u003E\u003Csection class=\"inner-wrapper clearfix\"\u003E\u003Csection class=\"main-article-content clearfix\"\u003E\u003Carticle class=\"ajax-article-content\"\u003E\u003Ch4 class=\"h4-original-paper\"\u003E\u003Cspan class=\"typcn typcn-document-text\"\u002F\u003EOriginal Paper\u003C\u002Fh4\u003E\u003Cdiv class=\"authors-container\"\u003E\u003Cdiv class=\"authors clearfix\"\u002F\u003E\u003C\u002Fdiv\u003E\u003Cdiv class=\"authors-container\"\u003E\u003Cdiv class=\"authors clearfix\"\u002F\u003E\u003C\u002Fdiv\u003E\u003Cdiv class=\"authors-container\"\u003E\u003Cdiv class=\"authors clearfix\"\u003E\u003Cul class=\"clearfix\"\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Stanislav+Nikolov\" class=\"btn-view-author-options\"\u003EStanislav Nikolov\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E\u003Csup\u003E*\u003C\u002Fsup\u003E, MEng\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0001-8234-0751\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Sam+Blackwell\" class=\"btn-view-author-options\"\u003ESam Blackwell\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E\u003Csup\u003E*\u003C\u002Fsup\u003E, MEng\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" 
href=\"https:\u002F\u002Forcid.org\u002F0000-0001-8730-3036\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Alexei+Zverovitch\" class=\"btn-view-author-options\"\u003EAlexei Zverovitch\u003Csup\u003E\u003Csmall\u003E2\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E\u003Csup\u003E*\u003C\u002Fsup\u003E, PhD\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-0567-5440\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Ruheena+Mendes\" class=\"btn-view-author-options\"\u003ERuheena Mendes\u003Csup\u003E\u003Csmall\u003E3\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, MB ChB\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0003-4754-1181\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Michelle+Livne\" class=\"btn-view-author-options\"\u003EMichelle Livne\u003Csup\u003E\u003Csmall\u003E2\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, PhD\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-8277-4733\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Jeffrey+De Fauw\" class=\"btn-view-author-options\"\u003EJeffrey De Fauw\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, BSc\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0001-6971-5678\" target=\"_blank\" 
title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Yojan+Patel\" class=\"btn-view-author-options\"\u003EYojan Patel\u003Csup\u003E\u003Csmall\u003E2\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, BA\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0001-6397-6279\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Clemens+Meyer\" class=\"btn-view-author-options\"\u003EClemens Meyer\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, MSc\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0003-1165-6104\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Harry+Askham\" class=\"btn-view-author-options\"\u003EHarry Askham\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, MSc\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0003-1530-4683\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Bernadino+Romera-Paredes\" class=\"btn-view-author-options\"\u003EBernadino Romera-Paredes\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, PhD\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0003-3604-3590\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca 
href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Christopher+Kelly\" class=\"btn-view-author-options\"\u003EChristopher Kelly\u003Csup\u003E\u003Csmall\u003E2\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, PhD\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-1246-844X\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Alan+Karthikesalingam\" class=\"btn-view-author-options\"\u003EAlan Karthikesalingam\u003Csup\u003E\u003Csmall\u003E2\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, PhD\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0001-5074-898X\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Carlton+Chu\" class=\"btn-view-author-options\"\u003ECarlton Chu\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, PhD\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0001-8282-6364\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Dawn+Carnell\" class=\"btn-view-author-options\"\u003EDawn Carnell\u003Csup\u003E\u003Csmall\u003E3\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, MD\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-2898-3219\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Cheng+Boon\" class=\"btn-view-author-options\"\u003ECheng 
Boon\u003Csup\u003E\u003Csmall\u003E4\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, MB ChB\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0003-2652-9263\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Derek+D'Souza\" class=\"btn-view-author-options\"\u003EDerek D'Souza\u003Csup\u003E\u003Csmall\u003E3\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, MSc\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-4393-7683\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Syed Ali+Moinuddin\" class=\"btn-view-author-options\"\u003ESyed Ali Moinuddin\u003Csup\u003E\u003Csmall\u003E3\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, MSc\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-8955-8224\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Bethany+Garie\" class=\"btn-view-author-options\"\u003EBethany Garie\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, BMRSc (RT)\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0003-3538-9063\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Yasmin+McQuinlan\" class=\"btn-view-author-options\"\u003EYasmin McQuinlan\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, BRT\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" 
href=\"https:\u002F\u002Forcid.org\u002F0000-0002-8464-0640\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Sarah+Ireland\" class=\"btn-view-author-options\"\u003ESarah Ireland\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, BMRSc (RT)\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-2975-2447\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Kiarna+Hampton\" class=\"btn-view-author-options\"\u003EKiarna Hampton\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, MPH\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-4384-6108\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Krystle+Fuller\" class=\"btn-view-author-options\"\u003EKrystle Fuller\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, BAppSc (RT)\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0003-0706-6857\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Hugh+Montgomery\" class=\"btn-view-author-options\"\u003EHugh Montgomery\u003Csup\u003E\u003Csmall\u003E5\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, BSc, MB BS, MD\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0001-8797-5019\" target=\"_blank\" 
title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Geraint+Rees\" class=\"btn-view-author-options\"\u003EGeraint Rees\u003Csup\u003E\u003Csmall\u003E5\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, PhD\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-9623-7007\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Mustafa+Suleyman\" class=\"btn-view-author-options\"\u003EMustafa Suleyman\u003Csup\u003E\u003Csmall\u003E6\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-5415-4457\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Trevor+Back\" class=\"btn-view-author-options\"\u003ETrevor Back\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E, PhD\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-0567-8043\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=C&#237;an Owen+Hughes\" class=\"btn-view-author-options\"\u003EC&#237;an Owen Hughes\u003Csup\u003E\u003Csmall\u003E2\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E\u003Csup\u003E*\u003C\u002Fsup\u003E, MBChB, MRCS, MSc\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0001-6901-0985\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca 
href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Joseph R+Ledsam\" class=\"btn-view-author-options\"\u003EJoseph R Ledsam\u003Csup\u003E\u003Csmall\u003E7\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E\u003Csup\u003E*\u003C\u002Fsup\u003E, MB ChB\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0001-9917-7196\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E;&#160;\u003C\u002Fli\u003E\u003Cli\u003E\u003Ca href=\"\u002Fsearch\u002FsearchResult?field%5B%5D=author&amp;criteria%5B%5D=Olaf+Ronneberger\" class=\"btn-view-author-options\"\u003EOlaf Ronneberger\u003Csup\u003E\u003Csmall\u003E1\u003C\u002Fsmall\u003E\u003C\u002Fsup\u003E\u003Csup\u003E*\u003C\u002Fsup\u003E, PhD\u003C\u002Fa\u003E\u003Ca class=\"author-orcid\" href=\"https:\u002F\u002Forcid.org\u002F0000-0002-4266-1515\" target=\"_blank\" title=\"ORCID\"\u003E&#160;\u003C\u002Fa\u003E\u003C\u002Fli\u003E\u003C\u002Ful\u003E\u003Cdiv class=\"author-affiliation-details\"\u003E\u003Cp\u003E\u003Csup\u003E1\u003C\u002Fsup\u003EDeepMind, London, United Kingdom\u003C\u002Fp\u003E\u003Cp\u003E\u003Csup\u003E2\u003C\u002Fsup\u003EGoogle Health, London, United Kingdom\u003C\u002Fp\u003E\u003Cp\u003E\u003Csup\u003E3\u003C\u002Fsup\u003EUniversity College London Hospitals NHS Foundation Trust, London, United Kingdom\u003C\u002Fp\u003E\u003Cp\u003E\u003Csup\u003E4\u003C\u002Fsup\u003EClatterbridge Cancer Centre NHS Foundation Trust, Liverpool, United Kingdom\u003C\u002Fp\u003E\u003Cp\u003E\u003Csup\u003E5\u003C\u002Fsup\u003EUniversity College London, London, United Kingdom\u003C\u002Fp\u003E\u003Cp\u003E\u003Csup\u003E6\u003C\u002Fsup\u003EGoogle, London, United Kingdom\u003C\u002Fp\u003E\u003Cp\u003E\u003Csup\u003E7\u003C\u002Fsup\u003EGoogle AI, Tokyo, Japan\u003C\u002Fp\u003E\u003Cp\u003E*these authors contributed equally\u003C\u002Fp\u003E\u003C\u002Fdiv\u003E\u003C\u002Fdiv\u003E\u003Cdiv 
class=\"corresponding-author-and-affiliations clearfix\"\u003E\u003Cdiv class=\"corresponding-author-details\"\u003E\u003Ch3\u003ECorresponding Author:\u003C\u002Fh3\u003E\u003Cp\u003EC&#237;an Owen Hughes, MBChB, MRCS, MSc\u003C\u002Fp\u003E\u003Cp\u002F\u003E\u003Cp\u003EGoogle Health\u003C\u002Fp\u003E\u003Cp\u003E6 Pancras Square\u003C\u002Fp\u003E\u003Cp\u003ELondon, N1C 4AG\u003C\u002Fp\u003E\u003Cp\u003EUnited Kingdom\u003C\u002Fp\u003E\u003Cp\u003EPhone: 1 650 253 0000\u003C\u002Fp\u003E\u003Cp\u003EFax:1 650 253 0001\u003C\u002Fp\u003E\u003Cp\u003EEmail: \u003Ca href=\"mailto:cianh@google.com\"\u003Ecianh@google.com\u003C\u002Fa\u003E\u003C\u002Fp\u003E\u003Cbr\u002F\u003E\u003C\u002Fdiv\u003E\u003C\u002Fdiv\u003E\u003C\u002Fdiv\u003E\u003Csection class=\"article-content clearfix\"\u003E\u003Carticle class=\"abstract\"\u003E\u003Ch3 id=\"Abstract\" class=\"navigation-heading\" data-label=\"Abstract\"\u003EAbstract\u003C\u002Fh3\u003E\u003Cp\u003E\u003Cspan class=\"abstract-sub-heading\"\u003EBackground: \u003C\u002Fspan\u003EOver half a million individuals are diagnosed with head and neck cancer each year globally. Radiotherapy is an important curative treatment for this disease, but it requires manual time to delineate radiosensitive organs at risk. This planning process can delay treatment while also introducing interoperator variability, resulting in downstream radiation dose differences. 
Although auto-segmentation algorithms offer a potentially time-saving solution, the challenges in defining, quantifying, and achieving expert performance remain.\u003Cbr\u002F\u003E\u003C\u002Fp\u003E\u003Cp\u003E\u003Cspan class=\"abstract-sub-heading\"\u003EObjective: \u003C\u002Fspan\u003EAdopting a deep learning approach, we aim to demonstrate a 3D U-Net architecture that achieves expert-level performance in delineating 21 distinct head and neck organs at risk commonly segmented in clinical practice.\u003Cbr\u002F\u003E\u003C\u002Fp\u003E\u003Cp\u003E\u003Cspan class=\"abstract-sub-heading\"\u003EMethods: \u003C\u002Fspan\u003EThe model was trained on a data set of 663 deidentified computed tomography scans acquired in routine clinical practice and with both segmentations taken from clinical practice and segmentations created by experienced radiographers as part of this research, all in accordance with consensus organ at risk definitions.\u003Cbr\u002F\u003E\u003C\u002Fp\u003E\u003Cp\u003E\u003Cspan class=\"abstract-sub-heading\"\u003EResults: \u003C\u002Fspan\u003EWe demonstrated the model&#8217;s clinical applicability by assessing its performance on a test set of 21 computed tomography scans from clinical practice, each with 21 organs at risk segmented by 2 independent experts. We also introduced surface Dice similarity coefficient, a new metric for the comparison of organ delineation, to quantify the deviation between organ at risk surface contours rather than volumes, better reflecting the clinical task of correcting errors in automated organ segmentations. 
The model&#8217;s generalizability was then demonstrated on 2 distinct open-source data sets, reflecting different centers and countries to model training.\u003Cbr\u002F\u003E\u003C\u002Fp\u003E\u003Cp\u003E\u003Cspan class=\"abstract-sub-heading\"\u003EConclusions: \u003C\u002Fspan\u003EDeep learning is an effective and clinically applicable technique for the segmentation of the head and neck anatomy for radiotherapy. With appropriate validation studies and regulatory approvals, this system could improve the efficiency, consistency, and safety of radiotherapy pathways.\u003Cbr\u002F\u003E\u003C\u002Fp\u003E\u003Cstrong class=\"h4-article-volume-issue\"\u003EJ Med Internet Res 2021;23(7):e26151\u003C\u002Fstrong\u003E\u003Cbr\u002F\u003E\u003Cbr\u002F\u003E\u003Cspan class=\"article-doi\"\u003E\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.2196\u002F26151\"\u003Edoi:10.2196\u002F26151\u003C\u002Fa\u003E\u003C\u002Fspan\u003E\u003Cbr\u002F\u003E\u003Cbr\u002F\u003E\u003Ch3 class=\"h3-main-heading\" id=\"Keywords\"\u003EKeywords\u003C\u002Fh3\u003E\u003Cdiv class=\"keywords\"\u003E\u003Cspan\u003E\u003Ca href=\"\u002Fsearch?type=keyword&amp;term=radiotherapy\"\u003Eradiotherapy\u003C\u002Fa\u003E;&#160;\u003C\u002Fspan\u003E\u003Cspan\u003E\u003Ca href=\"\u002Fsearch?type=keyword&amp;term=segmentation\"\u003Esegmentation\u003C\u002Fa\u003E;&#160;\u003C\u002Fspan\u003E\u003Cspan\u003E\u003Ca href=\"\u002Fsearch?type=keyword&amp;term=contouring\"\u003Econtouring\u003C\u002Fa\u003E;&#160;\u003C\u002Fspan\u003E\u003Cspan\u003E\u003Ca href=\"\u002Fsearch?type=keyword&amp;term=machine learning\"\u003Emachine learning\u003C\u002Fa\u003E;&#160;\u003C\u002Fspan\u003E\u003Cspan\u003E\u003Ca href=\"\u002Fsearch?type=keyword&amp;term=artificial intelligence\"\u003Eartificial intelligence\u003C\u002Fa\u003E;&#160;\u003C\u002Fspan\u003E\u003Cspan\u003E\u003Ca 
href=\"\u002Fsearch?type=keyword&amp;term=UNet\"\u003EUNet\u003C\u002Fa\u003E;&#160;\u003C\u002Fspan\u003E\u003Cspan\u003E\u003Ca href=\"\u002Fsearch?type=keyword&amp;term=convolutional neural networks\"\u003Econvolutional neural networks\u003C\u002Fa\u003E;&#160;\u003C\u002Fspan\u003E\u003Cspan\u003E\u003Ca href=\"\u002Fsearch?type=keyword&amp;term=surface DSC\"\u003Esurface DSC\u003C\u002Fa\u003E&#160;\u003C\u002Fspan\u003E\u003C\u002Fdiv\u003E\u003Cdiv id=\"trendmd-suggestions\"\u002F\u003E\u003C\u002Farticle\u003E\u003Cbr\u002F\u003E\u003Carticle class=\"main-article clearfix\"\u003E\u003Cbr\u002F\u003E\u003Ch3 class=\"navigation-heading h3-main-heading\" id=\"Introduction\" data-label=\"Introduction\"\u003EIntroduction\u003C\u002Fh3\u003E\u003Ch4\u003EBackground\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EEach year, 550,000 people worldwide are diagnosed with cancer of the head and neck [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref1\" rel=\"footnote\"\u003E1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. This incidence is rising [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref2\" rel=\"footnote\"\u003E2\u003C\u002Fa\u003E\u003C\u002Fspan\u003E] and more than doubling in certain subgroups over the last 30 years [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref3\" rel=\"footnote\"\u003E3\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref5\" rel=\"footnote\"\u003E5\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. Where available, most patients will be treated with radiotherapy, which targets the tumor mass and areas at high risk of microscopic tumor spread. 
However, strategies are needed to mitigate the dose-dependent adverse effects that result from incidental irradiation of normal anatomical structures (\u003Ci\u003Eorgans at risk\u003C\u002Fi\u003E) [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref6\" rel=\"footnote\"\u003E6\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref9\" rel=\"footnote\"\u003E9\u003C\u002Fa\u003E\u003C\u002Fspan\u003E].\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EThus, the efficacy and safety of head and neck radiotherapy depends on the accurate delineation of organs at risk and tumors, a process known as segmentation or contouring. However, the fact that this process is predominantly done manually means that results may be both inconsistent and imperfectly accurate [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref10\" rel=\"footnote\"\u003E10\u003C\u002Fa\u003E\u003C\u002Fspan\u003E], leading to large inter- and intrapractitioner variability even among experts and thus variation in care quality [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref11\" rel=\"footnote\"\u003E11\u003C\u002Fa\u003E\u003C\u002Fspan\u003E].\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003ESegmentation is also very time consuming: an expert can spend 4 hours or more on a single case [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref12\" rel=\"footnote\"\u003E12\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. 
The duration of resulting delays in treatment initiation (\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#figure1\" rel=\"footnote\"\u003EFigure 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E) is associated with an increased risk of both local recurrence and overall mortality [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref13\" rel=\"footnote\"\u003E13\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref15\" rel=\"footnote\"\u003E15\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. Increasing demands for, and shortages of, trained staff already place a heavy burden on health care systems, which can lead to long delays for patients as radiotherapy is planned [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref16\" rel=\"footnote\"\u003E16\u003C\u002Fa\u003E\u003C\u002Fspan\u003E,\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref17\" rel=\"footnote\"\u003E17\u003C\u002Fa\u003E\u003C\u002Fspan\u003E], and the continued rise in head and neck cancer incidence may make it impossible to maintain even current temporal reporting standards [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref4\" rel=\"footnote\"\u003E4\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. Such issues also represent a barrier to \u003Ci\u003Eadaptive radiotherapy\u003C\u002Fi\u003E&#8212;the process of repeated scanning, segmentation, and radiotherapy planning throughout treatment, which maintains the precision of tumor targeting (and organ at risk avoidance) in the face of treatment-related anatomic changes such as tumor shrinkage [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref18\" rel=\"footnote\"\u003E18\u003C\u002Fa\u003E\u003C\u002Fspan\u003E].\u003C\u002Fp\u003E\u003Cfigure\u003E\u003Ca name=\"figure1\"\u003E&#8206;\u003C\u002Fa\u003E\u003Ca class=\"fancybox\" title=\"Figure 1. 
A typical clinical pathway for radiotherapy. After a patient is diagnosed and the decision is made to treat with radiotherapy, a defined workflow aims to provide treatment that is both safe and effective. In the United Kingdom, the time delay between decision to treat and treatment delivery should be no greater than 31 days. Time-intensive manual segmentation and dose optimization steps can introduce delays to treatment.\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fc3a17c2b83daa977a6e80bb77dcfc12f.png\" id=\"figure1\"\u003E\u003Cimg class=\"figure-image\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fc3a17c2b83daa977a6e80bb77dcfc12f.png\"\u002F\u003E\u003C\u002Fa\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-image\"\u002F\u003EFigure 1. A typical clinical pathway for radiotherapy. After a patient is diagnosed and the decision is made to treat with radiotherapy, a defined workflow aims to provide treatment that is both safe and effective. In the United Kingdom, the time delay between decision to treat and treatment delivery should be no greater than 31 days. Time-intensive manual segmentation and dose optimization steps can introduce delays to treatment. \u003C\u002Ffigcaption\u003E\u003Ca class=\"fancybox\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fc3a17c2b83daa977a6e80bb77dcfc12f.png\" title=\"Figure 1. A typical clinical pathway for radiotherapy. After a patient is diagnosed and the decision is made to treat with radiotherapy, a defined workflow aims to provide treatment that is both safe and effective. In the United Kingdom, the time delay between decision to treat and treatment delivery should be no greater than 31 days. 
Time-intensive manual segmentation and dose optimization steps can introduce delays to treatment.\"\u003EView this figure\u003C\u002Fa\u003E\u003C\u002Ffigure\u003E\u003Cp class=\"abstract-paragraph\"\u003EAutomated (ie, computer-performed) segmentation has the potential to address these challenges. However, most segmentation algorithms in clinical use are atlas based, producing segmentations by fitting previously labeled reference images to the new target scan. This might not sufficiently account for either postsurgical changes or the variability in normal anatomical structures that exist between patients, particularly when considering the variable effect that tumors may have on local anatomy; thus, they may be prone to systematic error. To date, such algorithm-derived segmentations still require significant manual editing, perform at expert levels on only a small number of organs, demonstrate an overall performance in clinical practice inferior to that of human experts, and have failed to significantly improve clinical workflows [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref19\" rel=\"footnote\"\u003E19\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref26\" rel=\"footnote\"\u003E26\u003C\u002Fa\u003E\u003C\u002Fspan\u003E].\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EIn recent years, deep learning&#8211;based algorithms have proven capable of delivering substantially better performance than traditional segmentation algorithms. Several deep learning&#8211;based approaches have been proposed for head and neck cancer segmentation. 
Some of them use standard convolutional neural network classifiers on patches with tailored pre- and postprocessing [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref27\" rel=\"footnote\"\u003E27\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref31\" rel=\"footnote\"\u003E31\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. However, the U-Net convolutional architecture [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref32\" rel=\"footnote\"\u003E32\u003C\u002Fa\u003E\u003C\u002Fspan\u003E] has shown promise in the area of deep learning&#8211;based medical image segmentation [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref33\" rel=\"footnote\"\u003E33\u003C\u002Fa\u003E\u003C\u002Fspan\u003E] and has also been applied to head and neck radiotherapy segmentation [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref34\" rel=\"footnote\"\u003E34\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref47\" rel=\"footnote\"\u003E47\u003C\u002Fa\u003E\u003C\u002Fspan\u003E].\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EDespite the promise that deep learning offers, barriers remain in the application of auto-segmentation in radiotherapy planning. 
These include the absence of consensus on how \u003Ci\u003Eexpert\u003C\u002Fi\u003E performance is defined, the lack of methods by which such human performance can be compared with that delivered by automated segmentation processes, and, consequently, the lack of an agreed way to establish the clinical acceptability of automated processes.\u003C\u002Fp\u003E\u003Ch4\u003EObjectives\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EIn this paper, we address these challenges in defining comparison metrics and report a deep learning approach that delineates a wide range of important organs at risk in head and neck cancer radiotherapy scans. We aim to achieve this using a study design that includes (1) the introduction of a clinically meaningful performance metric for segmentation in radiotherapy planning, (2) a representative set of images acquired during routine clinical practice, (3) an unambiguous segmentation protocol for all organs, and (4) a segmentation of each test set image according to these protocols by 2 independent experts. By achieving performance equal to that of human experts on previously unseen patients from the same hospital site used for training, we aim to demonstrate the clinical applicability of our approach, in addition to its generalizability as demonstrated on 2 distinct open-source data sets.\u003C\u002Fp\u003E\u003Cbr\u002F\u003E\u003Ch3 class=\"navigation-heading h3-main-heading\" id=\"Methods\" data-label=\"Methods\"\u003EMethods\u003C\u002Fh3\u003E\u003Ch4\u003EData Sets\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EUniversity College London Hospitals (UCLH) National Health Service (NHS) Foundation Trust serves an urban, mixed socioeconomic and ethnic population in central London, United Kingdom, and houses a specialist center for cancer treatment. 
Data were selected from a retrospective cohort of all adult (aged &gt;18 years) UCLH patients who underwent computed tomography (CT) scans to plan radical radiotherapy treatment for head and neck cancer between January 1, 2008, and March 20, 2016. Both initial CT images and rescans were included in the training data set. Patients with all tumor types, stages, and histological grades were considered for inclusion, as long as their CT scans were available in digital form and were of sufficient diagnostic quality. The standard CT pixel spacing was 0.976&#215;0.976&#215;2.5 mm, and scans with nonstandard spacing (with the exception of 1.25-mm spacing scans, which were subsampled) were excluded to ensure consistent performance metrics during training. It should be noted that for The Cancer Imaging Archive (TCIA) test set, the in-plane pixel spacing was not used as an exclusion criterion; it ranged from 0.94 to 1.27 mm. For the Public Domain Database for Computational Anatomy (PDDCA) test set, we included all scans; the voxels varied from 2 to 3 mm in height and from 0.98 to 1.27 mm in axial dimension. Patients&#8217; requests to not have their data shared for research were respected.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EOf the 513 patients who underwent radiotherapy at UCLH within the given study dates, a total of 486 patients (94.7%; 838 scans; mean age 57 years; male 337, female 146, and gender unknown 3) met the inclusion criteria. Of note, no scans were excluded because of poor diagnostic quality. Scans from UCLH were split into a training set (389 patients; 663 scans), validation set (51 patients; 100 scans), and test set (46 patients; 75 scans). From the selected test set, 19 patients (21 scans) underwent the adjudicated contouring described below. No patient was included in multiple data sets; in cases where multiple scans were present for a single patient, all were included in the same subset. 
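The slice-spacing rule described above (keep scans already at the 2.5-mm standard, subsample 1.25-mm scans, and exclude the rest) can be sketched as follows; `normalize_slice_spacing` is a hypothetical helper for illustration, not part of the study's pipeline.

```python
def normalize_slice_spacing(slice_positions_mm, target_mm=2.5):
    """Hypothetical sketch of the spacing rule: keep scans at the
    2.5 mm standard, subsample 1.25 mm scans by taking every other
    slice, and exclude anything else (returns None).
    `slice_positions_mm` are the sorted axial slice z-positions."""
    gaps = {round(b - a, 3)
            for a, b in zip(slice_positions_mm, slice_positions_mm[1:])}
    if len(gaps) != 1:
        return None  # missing slices or irregular spacing: exclude
    spacing = gaps.pop()
    if spacing == target_mm:
        return list(slice_positions_mm)
    if spacing == target_mm / 2:
        return list(slice_positions_mm[::2])  # 1.25 mm -> 2.5 mm subsample
    return None  # nonstandard spacing: exclude
```

Normalizing all training scans to one slice thickness keeps voxel geometry, and hence the model's performance metrics, comparable across cases.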
Multiple scans present for a single patient reflect CT scans taken for the purpose of replanning radiotherapy owing to anatomical changes during the course of treatment. It is important for models to perform well in both scenarios, as treatment-naive and postradiotherapy organ at risk anatomies can differ. However, to avoid potential correlation from the same organs being segmented twice within a single data set, care was taken to exclude such repeats from the TCIA test set (described later in this section).\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EIn total, 21 organs at risk were selected throughout the head and neck area to represent a wide range of anatomical regions. We used a combination of segmentations sourced from those used clinically at UCLH and additional segmentations performed in-house by trained radiographers.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EWe divided our UCLH data set into the following categories: (1) \u003Ci\u003Etraining set\u003C\u002Fi\u003E: used to train the model; a combination of UCLH clinical segmentations and in-house segmentations, some of which were only 2D slices (owing to the time required to segment larger organs manually, we initially relied heavily on sparse segmentations to make efficient use of the radiographers&#8217; time). (2) \u003Ci\u003EUCLH validation set\u003C\u002Fi\u003E: used to evaluate model performance and steer additional data set priorities; this set used in-house segmentations only, as we did not want to overfit to any clinical bias. 
(3) \u003Ci\u003EUCLH test set\u003C\u002Fi\u003E: our primary result set; each scan has every organ at risk labeled and was independently segmented from scratch by 2 radiographers before one of the pair of segmentations (chosen arbitrarily) was reviewed and corrected by an experienced radiation oncologist.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EAs these scans were all taken from UCLH patients, to assess generalizability we curated additional open-source CT scans available from The Cancer Genome Atlas Head-Neck Squamous Cell Carcinoma (TCGA-HNSC) and Head-Neck Cetuximab [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref48\" rel=\"footnote\"\u003E48\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref50\" rel=\"footnote\"\u003E50\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. The open-source (category 4) TCIA validation set and (category 5) TCIA test set were both labeled in the same way as our UCLH test set.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003ENon-CT planning scans and those that did not meet the same slice thickness as the UCLH scans (2.5 mm) were excluded. The remaining scans were then manually segmented in-house according to the Brouwer Atlas (the segmentation procedure is described in further detail in the \u003Ci\u003EClinical Labeling and Annotation\u003C\u002Fi\u003E section [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref51\" rel=\"footnote\"\u003E51\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]). We included 31 scans (22 Head-Neck Cetuximab and 9 TCGA-HNSC) that met these criteria, which we further split into validation (6 patients; 7 scans) and test (24 patients; 24 scans) sets (\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#figure2\" rel=\"footnote\"\u003EFigure 2\u003C\u002Fa\u003E\u003C\u002Fspan\u003E). 
The original segmentations from the Head-Neck Cetuximab data set were not included; a consensus assessment by experienced radiographers and oncologists found the segmentations either nonconformant to the selected segmentation protocol or below the quality that would be acceptable for clinical care. The original inclusion criteria for Head-Neck Cetuximab were patients with stage 3-4 carcinoma of the oropharynx, larynx, and hypopharynx, with a Zubrod performance of 0-1, and meeting predefined blood chemistry criteria between November 2005 and March 2009. The TCGA-HNSC data set included patients treated for head-neck squamous cell carcinoma, with no further restrictions being apparent [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref48\" rel=\"footnote\"\u003E48\u003C\u002Fa\u003E\u003C\u002Fspan\u003E,\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref50\" rel=\"footnote\"\u003E50\u003C\u002Fa\u003E\u003C\u002Fspan\u003E].\u003C\u002Fp\u003E\u003Cfigure\u003E\u003Ca name=\"figure2\"\u003E&#8206;\u003C\u002Fa\u003E\u003Ca class=\"fancybox\" title=\"Figure 2. Case selection from the University College London Hospitals and The Cancer Imaging Archive computed tomography data sets. A consort-style diagram demonstrating the application of inclusion and exclusion criteria to select the training, validation, and test sets used in this work. CT: computed tomography; HN_C: Head and Neck Carcinoma; N\u002FA: not applicable; TCIA: The Cancer Imaging Archive; TCGA: The Cancer Genome Atlas; UCLH: University College London Hospitals; Val: validation.\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F9c1afe91ef715718f5005d99d08c139a.png\" id=\"figure2\"\u003E\u003Cimg class=\"figure-image\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F9c1afe91ef715718f5005d99d08c139a.png\"\u002F\u003E\u003C\u002Fa\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-image\"\u002F\u003EFigure 2. 
Case selection from the University College London Hospitals and The Cancer Imaging Archive computed tomography data sets. A consort-style diagram demonstrating the application of inclusion and exclusion criteria to select the training, validation, and test sets used in this work. CT: computed tomography; HN_C: Head and Neck Carcinoma; N\u002FA: not applicable; TCIA: The Cancer Imaging Archive; TCGA: The Cancer Genome Atlas; UCLH: University College London Hospitals; Val: validation. \u003C\u002Ffigcaption\u003E\u003Ca class=\"fancybox\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F9c1afe91ef715718f5005d99d08c139a.png\" title=\"Figure 2. Case selection from the University College London Hospitals and The Cancer Imaging Archive computed tomography data sets. A consort-style diagram demonstrating the application of inclusion and exclusion criteria to select the training, validation, and test sets used in this work. CT: computed tomography; HN_C: Head and Neck Carcinoma; N\u002FA: not applicable; TCIA: The Cancer Imaging Archive; TCGA: The Cancer Genome Atlas; UCLH: University College London Hospitals; Val: validation.\"\u003EView this figure\u003C\u002Fa\u003E\u003C\u002Ffigure\u003E\u003Cp class=\"abstract-paragraph\"\u003EAll test sets were kept separate during model training and validation. 
\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#table1\" rel=\"footnote\"\u003ETable 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E describes in detail the demographics and characteristics within the data sets; to obtain a balanced demographic in each of the test, validation, and training data sets, we sampled randomly stratified splits and selected the one that minimized the differences between the key demographics in each data set.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EIn addition, (6) the \u003Ci\u003EPDDCA open-source data set\u003C\u002Fi\u003E consisted of 15 patients selected from the Head-Neck Cetuximab open-source data set [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref48\" rel=\"footnote\"\u003E48\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. Owing to differences in selection criteria and test, validation, or training set allocation, 5 scans were present in both the TCIA and PDDCA test sets. This data set was used without further postprocessing and only accessed once to assess the volumetric Dice similarity coefficient (DSC) performance. The PDDCA test set differs from the TCIA test set in both the segmentation protocol and the axial slice thickness. 
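The stratified split selection described above can be sketched as follows; the function name and the single mean-age criterion are illustrative assumptions, as the study balanced several key demographics across the splits.

```python
import random
import statistics

def pick_balanced_split(patient_ages, n_test, n_trials=500, seed=0):
    """Illustrative sketch of balanced split selection: draw many
    random patient-level splits and keep the one whose train/test
    mean-age difference is smallest. `patient_ages` maps patient
    identifiers to ages; a real pipeline would score multiple
    demographic criteria, not just age."""
    patients = sorted(patient_ages)
    rng = random.Random(seed)
    best_split, best_gap = None, float("inf")
    for _ in range(n_trials):
        shuffled = patients[:]
        rng.shuffle(shuffled)
        test, train = shuffled[:n_test], shuffled[n_test:]
        gap = abs(statistics.mean(patient_ages[p] for p in test)
                  - statistics.mean(patient_ages[p] for p in train))
        if gap < best_gap:
            best_split, best_gap = (sorted(train), sorted(test)), gap
    return best_split
```

Because candidate splits are drawn at the patient level, every scan from a given patient lands on the same side of the split, consistent with the no-patient-overlap rule described earlier.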
The work by Raudaschl et al [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref25\" rel=\"footnote\"\u003E25\u003C\u002Fa\u003E\u003C\u002Fspan\u003E] provides more details on the data set characteristics and preprocessing.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003E\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#table1\" rel=\"footnote\"\u003ETable 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E details the characteristics of these data sets and patient demographics.\u003C\u002Fp\u003E\u003Cdiv class=\"figure-table\"\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-clipboard\"\u002F\u003ETable 1.\n Data set characteristics\u003Csup\u003Ea\u003C\u002Fsup\u003E.\u003C\u002Ffigcaption\u003E\u003Ctable width=\"1000\" cellpadding=\"5\" cellspacing=\"0\" border=\"1\" rules=\"groups\" frame=\"hsides\"\u003E\u003Ccol width=\"30\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"290\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"0\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"120\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"0\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"130\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"0\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"100\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"0\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"0\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"130\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"0\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"100\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"0\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"0\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"100\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Cthead\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003EData set\u003C\u002Ftd\u003E\u003Ctd colspan=\"7\" 
rowspan=\"1\"\u003EUCLH\u003Csup\u003Eb\u003C\u002Fsup\u003E\u003C\u002Ftd\u003E\u003Ctd colspan=\"5\" rowspan=\"1\"\u003ETCIA\u003Csup\u003Ec\u003C\u002Fsup\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EPDDCA\u003Csup\u003Ed\u003C\u002Fsup\u003E\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003ETrain\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003EValidation\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003ETest\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003EValidation\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003ETest\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003ETest\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003C\u002Fthead\u003E\u003Ctbody\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003ETotal scans (patients), n\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E663 (389)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E100 (51)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E21 (19)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E7 (6)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E24 (24)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E15 (15)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003EAverage patient age (years)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E57.1\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E57.5\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E59.6\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E56.5\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E59.9\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E58.6\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr 
valign=\"top\"\u003E\u003Ctd colspan=\"16\" rowspan=\"1\"\u003E\u003Cb\u003ESex, number of scans (number of patients)\u003C\u002Fb\u003E\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EFemale\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E207 (115)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E36 (19)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E7 (6)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E2 (2)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E2 (2)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E2 (2)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EMale\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E450 (271)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E64 (32)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E14 (13)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E5 (4)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E20 (20)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E9 (9)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EUnknown\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E6 (3)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E2 (2)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E4 
(4)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd colspan=\"16\" rowspan=\"1\"\u003E\u003Cb\u003ETumor\u003C\u002Fb\u003E\u003Cb\u003Esite, number of scans (number of patients)\u003C\u002Fb\u003E\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EOropharynx\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E145 (86)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E27 (15)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E7 (6)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E8 (8)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E2 (2)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ELip, oral cavity, and pharynx\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E80 (52)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E20 (8)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E4 (4)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E1 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E3 (3)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ETongue\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E53 (26)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E8 (5)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E1 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E2 (2)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E7 
(7)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ELarynx\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E46 (31)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E8 (3)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E2 (2)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E2 (2)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E4 (4)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ENasopharynx\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E48 (24)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E5 (3)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EHead, face, and neck\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E37 (23)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E8 (3)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E1 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" 
colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ENasal cavity\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E32 (19)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E2 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E1 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EConnective and soft tissue\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E37 (18)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E2 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E1 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EHypopharynx\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E17 (10)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E1 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E2 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E1 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EAccessory sinus\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" 
rowspan=\"1\"\u003E10 (7)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E2 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EEsophagus\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E6 (2)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E1 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EOther\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E33 (20)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E1 (1)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EUnknown\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E119 (71)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E16 (9)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E4 (3)\u003C\u002Ftd\u003E\u003Ctd 
colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E13 (13)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd colspan=\"16\" rowspan=\"1\"\u003E\u003Cb\u003ESource, number of scans (number of patients)\u003C\u002Fb\u003E\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ETCGA\u003Csup\u003Ee\u003C\u002Fsup\u003E\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E&#8212;\u003Csup\u003Ef\u003C\u002Fsup\u003E\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E&#8212;\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E&#8212;\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E2 (2)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E7 (7)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EHN_Cetux\u003Csup\u003Eg\u003C\u002Fsup\u003E\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E&#8212;\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E&#8212;\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E&#8212;\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E5 (4)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E17 (17)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E15 (15)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd colspan=\"16\" rowspan=\"1\"\u003E\u003Cb\u003ESite, number of scans (number of patients)\u003C\u002Fb\u003E\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" 
colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EUCLH\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E663 (389)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E100 (51)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E21 (19)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EMD Anderson Cancer Center\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E2 (2)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E7 (7)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E\u003Cbr\u002F\u003E\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EUnknown (US)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E0 (0)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E5 (4)\u003C\u002Ftd\u003E\u003Ctd colspan=\"2\" rowspan=\"1\"\u003E17 (17)\u003C\u002Ftd\u003E\u003Ctd colspan=\"3\" rowspan=\"1\"\u003E15 (15)\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003C\u002Ftbody\u003E\u003C\u002Ftable\u003E\u003Cfn id=\"table1fn1\"\u003E\u003Cp\u003E\u003Csup\u003Ea\u003C\u002Fsup\u003ETumor sites were derived from International Classification of Diseases codes. 
The Cancer Genome Atlas Head-Neck Squamous Cell Carcinoma [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref52\" rel=\"footnote\"\u003E52\u003C\u002Fa\u003E\u003C\u002Fspan\u003E] is an open-source data set hosted on The Cancer Imaging Archive (TCIA). Head-Neck Cetuximab is an open-source data set hosted on TCIA [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref53\" rel=\"footnote\"\u003E53\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. The Public Domain Database for Computational Anatomy is a data set released as part of the 2015 challenge in the segmentation of head and neck anatomy at the International Conference on Medical Image Computing and Computer Assisted Intervention.\u003C\u002Fp\u003E\u003C\u002Ffn\u003E\u003Cfn id=\"table1fn2\"\u003E\u003Cp\u003E\u003Csup\u003Eb\u003C\u002Fsup\u003EUCLH: University College London Hospitals.\u003C\u002Fp\u003E\u003C\u002Ffn\u003E\u003Cfn id=\"table1fn3\"\u003E\u003Cp\u003E\u003Csup\u003Ec\u003C\u002Fsup\u003ETCIA: The Cancer Imaging Archive.\u003C\u002Fp\u003E\u003C\u002Ffn\u003E\u003Cfn id=\"table1fn4\"\u003E\u003Cp\u003E\u003Csup\u003Ed\u003C\u002Fsup\u003EPDDCA: Public Domain Database for Computational Anatomy.\u003C\u002Fp\u003E\u003C\u002Ffn\u003E\u003Cfn id=\"table1fn5\"\u003E\u003Cp\u003E\u003Csup\u003Ee\u003C\u002Fsup\u003ETCGA: The Cancer Genome Atlas Program.\u003C\u002Fp\u003E\u003C\u002Ffn\u003E\u003Cfn id=\"table1fn6\"\u003E\u003Cp\u003E\u003Csup\u003Ef\u003C\u002Fsup\u003EThe University College London Hospitals (UCLH) data set was sourced entirely from UCLH.\u003C\u002Fp\u003E\u003C\u002Ffn\u003E\u003Cfn id=\"table1fn7\"\u003E\u003Cp\u003E\u003Csup\u003Eg\u003C\u002Fsup\u003EHN_Cetux: Head-Neck Cetuximab.\u003C\u002Fp\u003E\u003C\u002Ffn\u003E\u003C\u002Fdiv\u003E\u003Ch4\u003EClinical Taxonomy\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003ETo select the organs at risk to be included in the study, we used the Brouwer Atlas (consensus guidelines for 
delineating organs at risk for head and neck radiotherapy, defined by an international panel of radiation oncologists [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref51\" rel=\"footnote\"\u003E51\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]). From this, we excluded those regions that required additional magnetic resonance imaging for segmentation, those that were not relevant to routine head and neck radiotherapy, or those that were not used clinically at UCLH. This resulted in a set of 21 organs at risk (\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#table2\" rel=\"footnote\"\u003ETable 2\u003C\u002Fa\u003E\u003C\u002Fspan\u003E).\u003C\u002Fp\u003E\u003Cdiv class=\"figure-table\"\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-clipboard\"\u002F\u003ETable 2.\n Taxonomy of segmentation regions.\u003C\u002Ffigcaption\u003E\u003Ctable width=\"1000\" cellpadding=\"5\" cellspacing=\"0\" border=\"1\" rules=\"groups\" frame=\"hsides\"\u003E\u003Ccol width=\"200\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"250\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Ccol width=\"550\" span=\"1\"\u003E\u003C\u002Fcol\u003E\u003Cthead\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EOrgan at risk\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ETotal number of labeled slices included\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EAnatomical landmarks and definition\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003C\u002Fthead\u003E\u003Ctbody\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EBrain\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E11,476\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ESits inside the cranium and includes all brain vessels excluding the brainstem and optic chiasm.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" 
colspan=\"1\"\u003EBrainstem\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E34,794\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EThe posterior aspect of the brain including the midbrain, pons, and medulla oblongata. Extending inferior from the lateral ventricles to the tip of the dens at C2. It is structurally continuous with the spinal cord.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ECochlea-left\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E4526\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EEmbedded in the temporal bone and lateral to the internal auditory meatus.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ECochlea-right\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E4754\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EEmbedded in the temporal bone and lateral to the internal auditory meatus.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ELacrimal-left\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E17,186\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EConcave-shaped gland located at the superolateral aspect of the orbit.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ELacrimal-right\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E17,788\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EConcave-shaped gland located at the superolateral aspect of the orbit.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ELens-left\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E3006\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EAn oval structure that sits within the anterior segment of the orbit. 
Can be variable in position but never sitting posterior beyond the level of the outer canthus.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ELens-right\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E3354\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EAn oval structure that sits within the anterior segment of the orbit. Can be variable in position but never sitting posterior beyond the level of the outer canthus.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ELung-left\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E8340\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EEncompassed by the thoracic cavity adjacent to the lateral aspect of the mediastinum, extending from the first rib to the diaphragm excluding the carina.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ELung-right\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E9158\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EEncompassed by the thoracic cavity adjacent to the lateral aspect of the mediastinum, extending from the first rib to the diaphragm excluding the carina.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EMandible\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E25,074\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EThe entire mandible bone including the temporomandibular joint, ramus, and body, excluding the teeth. 
The mandible joins to the inferior aspect of the temporal bone and forms the entire lower jaw.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EOptic-nerve-left\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E3458\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EA 2 to 5 mm thick nerve that runs from the posterior aspect of the eye, through the optic canal and ends at the lateral aspect of the optic chiasm.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EOptic-nerve-right\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E3012\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EA 2 to 5 mm thick nerve that runs from the posterior aspect of the eye, through the optic canal and ends at the lateral aspect of the optic chiasm.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EOrbit-left\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E8538\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ESpherical organ sitting within the orbital cavity. Includes the vitreous humor, retina, cornea, and lens with the optic nerve attached posteriorly.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EOrbit-right\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E8242\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ESpherical organ sitting within the orbital cavity. 
Includes the vitreous humor, retina, cornea, and lens with the optic nerve attached posteriorly.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EParotid-left\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E8984\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EMulti-lobed salivary gland wrapped around the mandibular ramus. Extends medially to the styloid process and parapharyngeal space. Laterally extending to the subcutaneous fat. Posteriorly extending to the sternocleidomastoid muscle. Anterior extending to posterior border of the mandible bone and masseter muscle. In cases where the retromandibular vein is encapsulated by parotid, this is included in the segmentation.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EParotid-right\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E11,752\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EMulti-lobed salivary gland wrapped around the mandibular ramus. Extends medially to the styloid process and parapharyngeal space. Laterally extending to the subcutaneous fat. Posteriorly extending to the sternocleidomastoid muscle. Anterior extending to posterior border of the mandible bone and masseter muscle. 
In cases where the retromandibular vein is encapsulated by parotid this is included in the segmentation.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ESpinal-canal\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E37,000\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003EHollow cavity that runs through the foramen of the vertebrae, extending from the base of skull to the end of the sacrum.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ESpinal-cord\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E37,096\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ESits inside the spinal canal and extends from the level of the foramen magnum to the bottom of L2.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ESubmandibular-left\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E10,652\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ESits within the submandibular portion of the anterior triangle of the neck, making up the floor of the mouth and extending both superior and inferior to the posterior aspect of the mandible and is limited laterally by the mandible and medially by the hypoglossal muscle.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr valign=\"top\"\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ESubmandibular-right\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003E10,716\u003C\u002Ftd\u003E\u003Ctd rowspan=\"1\" colspan=\"1\"\u003ESits within the submandibular portion of the anterior triangle of the neck, making up the floor of the mouth and extending both superior and inferior to the posterior aspect of the mandible and is limited laterally by the mandible and medially by the hypoglossal 
muscle.\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003C\u002Ftbody\u003E\u003C\u002Ftable\u003E\u003C\u002Fdiv\u003E\u003Ch4\u003EClinical Labeling and Annotation\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EOwing to the large variability of segmentation protocols used and annotation quality in the UCLH data set, all segmentations from all scans selected for inclusion in the training set were manually reviewed by a radiographer with at least 4 years of experience in the segmentation of head and neck organs at risk. Volumes that did not conform to the Brouwer Atlas were excluded from the training. To increase the number of training examples, additional axial slices were randomly selected for further manual organ at risk segmentations to be added based on model performance or perceived imbalances in the data set. These were then produced by a radiographer with at least 4 years of experience in head and neck radiotherapy, arbitrated by a second radiographer with the same level of experience. The total number of examples from the original UCLH segmentations and additional slices are provided in \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#table2\" rel=\"footnote\"\u003ETable 2\u003C\u002Fa\u003E\u003C\u002Fspan\u003E.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EFor the TCIA test and validation sets, the original dense segmentations were not used owing to poor adherence to the chosen study protocol. To produce the ground truth labels, the full volumes of all 21 organs at risk included in the study were segmented. This was done initially by a radiographer with at least 4 years of experience in the segmentation of head and neck organs at risk and then arbitrated by a second radiographer with similar experience. Further arbitration was then performed by a radiation oncologist with at least 5 years of postcertification experience in head and neck radiotherapy. 
The same process was repeated with 2 additional radiographers working independently, but after peer arbitration, these segmentations were not reviewed by an oncologist; rather, they became the human reference to which the model was compared. This is schematically shown in \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#figure3\" rel=\"footnote\"\u003EFigure 3\u003C\u002Fa\u003E\u003C\u002Fspan\u003E. Before participation, all radiographers and oncologists were required to study the Brouwer Atlas for head and neck organ at risk segmentation [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref51\" rel=\"footnote\"\u003E51\u003C\u002Fa\u003E\u003C\u002Fspan\u003E] and demonstrate competence in adhering to these guidelines.\u003C\u002Fp\u003E\u003Cfigure\u003E\u003Ca name=\"figure3\"\u003E&#8206;\u003C\u002Fa\u003E\u003Ca class=\"fancybox\" title=\"Figure 3. Process for the segmentation of ground truth and radiographer organs at risk volumes. The flowchart illustrates how the ground truth segmentations were created and compared with independent radiographer segmentations and the model. For the ground truth, each computed tomography scan in The Cancer Imaging Archive test set was segmented first by a radiographer and peer reviewed by a second radiographer. This then went through one or more iterations of review and editing with a specialist oncologist before creating a ground truth used to compare with the segmentations produced by both the model and additional radiographer. CT: computed tomography.\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F927f4012c92da3f5a055064e95bb5f0b.png\" id=\"figure3\"\u003E\u003Cimg class=\"figure-image\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F927f4012c92da3f5a055064e95bb5f0b.png\"\u002F\u003E\u003C\u002Fa\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-image\"\u002F\u003EFigure 3. 
Process for the segmentation of ground truth and radiographer organs at risk volumes. The flowchart illustrates how the ground truth segmentations were created and compared with independent radiographer segmentations and the model. For the ground truth, each computed tomography scan in The Cancer Imaging Archive test set was segmented first by a radiographer and peer reviewed by a second radiographer. This then went through one or more iterations of review and editing with a specialist oncologist before creating a ground truth used to compare with the segmentations produced by both the model and additional radiographer. CT: computed tomography. \u003C\u002Ffigcaption\u003E\u003Ca class=\"fancybox\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F927f4012c92da3f5a055064e95bb5f0b.png\" title=\"Figure 3. Process for the segmentation of ground truth and radiographer organs at risk volumes. The flowchart illustrates how the ground truth segmentations were created and compared with independent radiographer segmentations and the model. For the ground truth, each computed tomography scan in The Cancer Imaging Archive test set was segmented first by a radiographer and peer reviewed by a second radiographer. This then went through one or more iterations of review and editing with a specialist oncologist before creating a ground truth used to compare with the segmentations produced by both the model and additional radiographer. CT: computed tomography.\"\u003EView this figure\u003C\u002Fa\u003E\u003C\u002Ffigure\u003E\u003Ch4\u003EModel Architecture\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EWe used a residual 3D U-Net architecture with 8 levels (\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#figure4\" rel=\"footnote\"\u003EFigure 4\u003C\u002Fa\u003E\u003C\u002Fspan\u003E). 
Our network takes in a CT volume (single channel) and outputs a segmentation mask with 21 channels, where each channel contains a binary segmentation mask for a specific organ at risk. The network consists of 7 residual convolutional blocks in the downward path, a residual fully connected block at the bottom, and 7 residual convolutional blocks in the upward path. A 1&#215;1&#215;1 convolution layer with sigmoidal activation produces the final output in the original resolution of the input image. Each predicted slice had 21 slices of context; this 21-slice context (ie, 21 &#215; 2.5 mm=52.5 mm) was found to provide the optimal context. That this number matches the 21 organs at risk segmented in this study is coincidental.\u003C\u002Fp\u003E\u003Cfigure\u003E\u003Ca name=\"figure4\"\u003E&#8206;\u003C\u002Fa\u003E\u003Ca class=\"fancybox\" title=\"Figure 4. 3D U-Net model architecture. (a) At training time, the model receives 21 contiguous computed tomography slices, which are processed through a series of &#8220;down&#8221; blocks, a fully connected block, and a series of &#8220;up&#8221; blocks to create a segmentation prediction. (b) A detailed view of the convolutional residual down and up blocks and the residual fully connected block.\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fa8378390342460c336ae6203846eba44.png\" id=\"figure4\"\u003E\u003Cimg class=\"figure-image\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fa8378390342460c336ae6203846eba44.png\"\u002F\u003E\u003C\u002Fa\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-image\"\u002F\u003EFigure 4. 3D U-Net model architecture. (a) At training time, the model receives 21 contiguous computed tomography slices, which are processed through a series of &#8220;down&#8221; blocks, a fully connected block, and a series of &#8220;up&#8221; blocks to create a segmentation prediction. (b) A detailed view of the convolutional residual down and up blocks and the residual fully connected block. 
\u003C\u002Ffigcaption\u003E\u003Ca class=\"fancybox\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fa8378390342460c336ae6203846eba44.png\" title=\"Figure 4. 3D U-Net model architecture. (a) At training time, the model receives 21 contiguous computed tomography slices, which are processed through a series of &#8220;down&#8221; blocks, a fully connected block, and a series of &#8220;up&#8221; blocks to create a segmentation prediction. (b) A detailed view of the convolutional residual down and up blocks and the residual fully connected block.\"\u003EView this figure\u003C\u002Fa\u003E\u003C\u002Ffigure\u003E\u003Cp class=\"abstract-paragraph\"\u003EWe trained our network with a regularized top-\u003Ci\u003Ek\u003C\u002Fi\u003E-percent, pixel-wise, binary, cross-entropy loss [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref54\" rel=\"footnote\"\u003E54\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]; for each output channel, the top-\u003Ci\u003Ek\u003C\u002Fi\u003E loss selects only the \u003Ci\u003Ek\u003C\u002Fi\u003E% most difficult pixels (those with the highest binary cross-entropy) and only adds their contribution to the total loss. This speeds up training and helps the network to tackle the large class imbalance and to focus on difficult examples.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EWe regularized the model using standard L2 weight regularization with scale 10\u003Csup\u003E&#8722;6\u003C\u002Fsup\u003E and extensive data augmentation using random in-plane (ie, in \u003Ci\u003Ex\u003C\u002Fi\u003E and \u003Ci\u003Ey\u003C\u002Fi\u003E directions only) translation, rotation, scaling, shearing, mirroring, elastic deformations, and pixel-wise noise. We used uniform translations between &#8722;32 and 32 pixels, uniform rotations between &#8722;9&#176; and 9&#176;, uniform scaling factors between 0.8 and 1.2, and uniform shear factors between &#8722;0.1 and 0.1. 
We mirrored the images (and adjusted the corresponding left and right labels) with a probability of 0.5. We performed elastic deformations by placing random displacement vectors (SD 5 mm, in-plane displacements only) on a control point grid with 100&#215;100&#215;100 mm spacing and by deriving the dense deformation field using cubic b-spline interpolation. In the implementation, all spatial transformations are first combined into a dense deformation field, which is then applied to the image using bilinear interpolation and extrapolation with zero padding. We added zero-mean Gaussian intensity noise independently to each pixel with an SD of 20 Hounsfield units.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EWe trained the model with the Adam optimizer [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref53\" rel=\"footnote\"\u003E53\u003C\u002Fa\u003E\u003C\u002Fspan\u003E] for 120,000 steps and a batch size of 32 (32 graphical processing units) using synchronous stochastic gradient descent. We used an initial learning rate of 10\u003Csup\u003E&#8722;4\u003C\u002Fsup\u003E and scaled the learning rate by 1\u002F2, 1\u002F8, 1\u002F64, and 1\u002F256 at time steps of 24,000, 60,000, 108,000, and 114,000, respectively.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EWe used the validation set to select the model that performed at over 95% for most organs at risk according to our chosen surface DSC performance metric, breaking ties by preferring better performance on more clinically impactful organs at risk and higher absolute performance.\u003C\u002Fp\u003E\u003Ch4\u003EPerformance Metrics\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EAll performance metrics are reported for each organ independently (eg, separately for just the left parotid), so we only need to deal with binary masks (eg, each voxel is either left parotid or non&#8211;left parotid). 
Masks are defined as a subset of \u003Cimg class=\"inline-graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F3f007e4a26946ad814aa5bc2778fb2d4.png\" border=\"0\" style=\"width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;\"\u002F\u003E, that is, \u003Cimg class=\"inline-graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fa4794be6bc36041d9601047477506cf6.png\" border=\"0\" style=\"width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;\"\u002F\u003E (\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#figure5\" rel=\"footnote\"\u003EFigure 5\u003C\u002Fa\u003E\u003C\u002Fspan\u003E).\u003C\u002Fp\u003E\u003Cfigure\u003E\u003Ca name=\"figure5\"\u003E&#8206;\u003C\u002Fa\u003E\u003Ca class=\"fancybox\" title=\"Figure 5. Illustrations of masks, surfaces, border regions, and the &#8220;overlapping&#8221; surface at tolerance &#964;.\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fe0be8500f85a93e28f44b06d90eb45fb.png\" id=\"figure5\"\u003E\u003Cimg class=\"figure-image\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fe0be8500f85a93e28f44b06d90eb45fb.png\"\u002F\u003E\u003C\u002Fa\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-image\"\u002F\u003EFigure 5. Illustrations of masks, surfaces, border regions, and the &#8220;overlapping&#8221; surface at tolerance &#964;. \u003C\u002Ffigcaption\u003E\u003Ca class=\"fancybox\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fe0be8500f85a93e28f44b06d90eb45fb.png\" title=\"Figure 5. 
Illustrations of masks, surfaces, border regions, and the &#8220;overlapping&#8221; surface at tolerance &#964;.\"\u003EView this figure\u003C\u002Fa\u003E\u003C\u002Ffigure\u003E\u003Cp class=\"abstract-paragraph\"\u003EThe volume of a mask is denoted as \u003Cimg class=\"inline-graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F694ddae7b041e6d1952f700634d0acbc.png\" border=\"0\" style=\"width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;\"\u002F\u003E, with\u003C\u002Fp\u003E\u003Cblockquote\u003E\u003Cimg class=\"graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F3fa19e4f0be5b69a802dd0f12eb23ee6.png\" border=\"0\" style=\"text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;\"\u002F\u003E\u003C\u002Fblockquote\u003E\u003Cp class=\"abstract-paragraph\"\u003EWith this notation, the standard (volumetric) DSC for two given masks \u003Ci\u003EM\u003C\u002Fi\u003E\u003Csub\u003E1\u003C\u002Fsub\u003E and \u003Ci\u003EM\u003C\u002Fi\u003E\u003Csub\u003E2\u003C\u002Fsub\u003E can be written as:\u003C\u002Fp\u003E\u003Cblockquote\u003E\u003Cimg class=\"graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F6deb202a14a449a075ea5b82910d2f90.png\" border=\"0\" style=\"text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;\"\u002F\u003E\u003C\u002Fblockquote\u003E\u003Cp class=\"abstract-paragraph\"\u003EIn the case of sparse ground truth segmentations (ie, only a few slices of the CT scan are labeled), we estimate the volumetric DSC by aggregating data from labeled voxels across multiple scans and patients as\u003C\u002Fp\u003E\u003Cblockquote\u003E\u003Cimg class=\"graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F9805ff85f092184875b547c172cb2e91.png\" border=\"0\" style=\"text-align:center;margin-left: auto;margin-right: auto;display: 
block;background-color: #ffffff;\"\u002F\u003E\u003C\u002Fblockquote\u003E\u003Cp class=\"abstract-paragraph\"\u003Ewhere the mask \u003Ci\u003EM\u003C\u002Fi\u003E\u003Csub\u003E1,\u003C\u002Fsub\u003E\u003Ci\u003E\u003Csub\u003Ep\u003C\u002Fsub\u003E\u003C\u002Fi\u003E and the labeled region \u003Ci\u003EL\u003Csub\u003Ep\u003C\u002Fsub\u003E\u003C\u002Fi\u003E represent the sparse ground truth segmentation for a patient \u003Ci\u003Ep\u003C\u002Fi\u003E and the mask \u003Ci\u003EM\u003C\u002Fi\u003E\u003Csub\u003E2,\u003C\u002Fsub\u003E\u003Ci\u003E\u003Csub\u003Ep\u003C\u002Fsub\u003E\u003C\u002Fi\u003E is the full volume predicted segmentation for the patient \u003Ci\u003Ep\u003C\u002Fi\u003E.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EOwing to the shortcomings of the volumetric DSC metric for the presented radiotherapy use case, we introduced the \u003Ci\u003Esurface DSC\u003C\u002Fi\u003E metric, which assesses the overlap of two surfaces (at a specified tolerance) instead of the overlap of two volumes (see \u003Ci\u003EResults\u003C\u002Fi\u003E section). 
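Read as code, the standard and the sparsely aggregated volumetric DSC definitions above correspond to something like the following NumPy sketch; the function names and the per-patient tuple layout are illustrative assumptions, not part of the published method.

```python
import numpy as np

def dsc(m1, m2):
    """Standard volumetric Dice similarity coefficient for two binary masks."""
    m1, m2 = m1.astype(bool), m2.astype(bool)
    denom = m1.sum() + m2.sum()
    return 2.0 * np.logical_and(m1, m2).sum() / denom if denom else 1.0

def sparse_dsc(pairs):
    """Estimate the volumetric DSC from sparsely labeled data. `pairs` is a
    list of (ground_truth, prediction, labeled_region) boolean arrays, one
    tuple per patient. Intersection and mask sizes are restricted to the
    labeled region and aggregated over all patients before forming the ratio."""
    inter = total = 0
    for gt, pred, labeled in pairs:
        gt = gt.astype(bool) & labeled.astype(bool)
        pred = pred.astype(bool) & labeled.astype(bool)
        inter += np.logical_and(gt, pred).sum()
        total += gt.sum() + pred.sum()
    return 2.0 * inter / total if total else 1.0
```

When every voxel is labeled, the aggregated estimate for a single patient reduces to the standard DSC.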
A surface is the border of a mask, \u003Cimg class=\"inline-graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fbd9b9263e53953e076fb11bab157ebb3.png\" border=\"0\" style=\"width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;\"\u002F\u003E, and the area of the surface is denoted as\u003C\u002Fp\u003E\u003Cblockquote\u003E\u003Cimg class=\"graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F690e2bf25800200a3b0cadc1c0fc697f.png\" border=\"0\" style=\"text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;\"\u002F\u003E\u003C\u002Fblockquote\u003E\u003Cp class=\"abstract-paragraph\"\u003Ewhere \u003Cimg class=\"inline-graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F342d467e9edd7fc4b51b06d55466c27b.png\" border=\"0\" style=\"width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;\"\u002F\u003E is a point on the surface using arbitrary parameterization. The mapping from this parameterization to a point in \u003Cimg class=\"inline-graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fa1d33ba387e9ccab97a506db3e0392ce.png\" border=\"0\" style=\"width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;\"\u002F\u003E is denoted as \u003Cimg class=\"inline-graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F91159bf804fe2c4fae791272fb556053.png\" border=\"0\" style=\"width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;\"\u002F\u003E, that is, \u003Cimg class=\"inline-graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F69078fc22497f2707c06e912c3218ff7.png\" border=\"0\" style=\"width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;\"\u002F\u003E. 
With this we can define the border region \u003Cimg class=\"inline-graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F693bab5ffe535fb785c90130ff377c38.png\" border=\"0\" style=\"width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;\"\u002F\u003E, for the surface \u003Ci\u003ES\u003Csub\u003Ei\u003C\u002Fsub\u003E\u003C\u002Fi\u003E, at a given tolerance \u003Ci\u003E&#964;\u003C\u002Fi\u003E as (\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#figure5\" rel=\"footnote\"\u003EFigure 5\u003C\u002Fa\u003E\u003C\u002Fspan\u003E)\u003C\u002Fp\u003E\u003Cblockquote\u003E\u003Cimg class=\"graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F005a7b98c397f6836b094b8fa0b4a3fa.png\" border=\"0\" style=\"text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;\"\u002F\u003E\u003C\u002Fblockquote\u003E\u003Cp class=\"abstract-paragraph\"\u003EUsing these definitions, we can write the \u003Ci\u003Esurface DSC at tolerance &#964;\u003C\u002Fi\u003E as\u003C\u002Fp\u003E\u003Cblockquote\u003E\u003Cimg class=\"graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F4b6c33e0555e226e29c7fbbb351f01b7.png\" border=\"0\" style=\"text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;\"\u002F\u003E\u003C\u002Fblockquote\u003E\u003Cp class=\"abstract-paragraph\"\u003Eusing an informal notation for the intersection of the surface with the boundary, that is,\u003C\u002Fp\u003E\u003Cblockquote\u003E\u003Cimg class=\"graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F608002607a80f003d874f34b4ddb1e6c.png\" border=\"0\" style=\"text-align:center;margin-left: auto;margin-right: auto;display: block;background-color: #ffffff;\"\u002F\u003E\u003C\u002Fblockquote\u003E\u003Ch4\u003EImplementation of Surface DSC\u003C\u002Fh4\u003E\u003Cp 
class=\"abstract-paragraph\"\u003EThe computation of surface integrals on sampled images is not straightforward, especially for medical images, where the voxel spacing is usually not equal in all 3 dimensions. The common approximation of the integral by counting the surface voxels can lead to substantial systematic errors.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EAnother common challenge is the representation of a surface with voxels. As the surface of a binary mask is located between voxels, a definition of \u003Ci\u003Esurface voxels\u003C\u002Fi\u003E in the raster-space of the image introduces a bias: using foreground voxels to represent the surface leads to an underestimation of the surface, whereas the use of background voxels leads to an overestimation.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EOur proposed implementation uses a surface representation that provides less-biased estimates but still allows us to compute the performance metrics with linear complexity O(\u003Ci\u003EN\u003C\u002Fi\u003E), where \u003Ci\u003EN\u003C\u002Fi\u003E is the number of voxels. We placed the surface points between the voxels on a raster that is shifted by half of the raster spacing on each axis (see \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#figure6\" rel=\"footnote\"\u003EFigure 6\u003C\u002Fa\u003E\u003C\u002Fspan\u003E for a 2D illustration).\u003C\u002Fp\u003E\u003Cfigure\u003E\u003Ca name=\"figure6\"\u003E&#8206;\u003C\u002Fa\u003E\u003Ca class=\"fancybox\" title=\"Figure 6. 2D illustration of the implementation of the surface Dice similarity coefficient. (a) A binary mask displayed as an image. The origin of the image raster is (0,0). (b) The surface points (red circles) are located in a raster that is shifted half of the raster spacing on each axis. Each surface point has 4 neighbors in 2D (8 neighbors in 3D). 
The local contour (blue line) assigned to each surface point (red circle) depends on the neighbor constellation.\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fd1861009f50dd76330c2350f605fa9bf.png\" id=\"figure6\"\u003E\u003Cimg class=\"figure-image\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fd1861009f50dd76330c2350f605fa9bf.png\"\u002F\u003E\u003C\u002Fa\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-image\"\u002F\u003EFigure 6. 2D illustration of the implementation of the surface Dice similarity coefficient. (a) A binary mask displayed as an image. The origin of the image raster is (0,0). (b) The surface points (red circles) are located in a raster that is shifted half of the raster spacing on each axis. Each surface point has 4 neighbors in 2D (8 neighbors in 3D). The local contour (blue line) assigned to each surface point (red circle) depends on the neighbor constellation. \u003C\u002Ffigcaption\u003E\u003Ca class=\"fancybox\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fd1861009f50dd76330c2350f605fa9bf.png\" title=\"Figure 6. 2D illustration of the implementation of the surface Dice similarity coefficient. (a) A binary mask displayed as an image. The origin of the image raster is (0,0). (b) The surface points (red circles) are located in a raster that is shifted half of the raster spacing on each axis. Each surface point has 4 neighbors in 2D (8 neighbors in 3D). The local contour (blue line) assigned to each surface point (red circle) depends on the neighbor constellation.\"\u003EView this figure\u003C\u002Fa\u003E\u003C\u002Ffigure\u003E\u003Cp class=\"abstract-paragraph\"\u003EFor 3D images, each point in the raster has 8 neighboring voxels. As we analyzed binary masks, there are only 2\u003Csup\u003E8\u003C\u002Fsup\u003E=256 possible neighbor constellations. 
For each of these constellations, we computed the resulting triangles using the marching cubes triangulation [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref55\" rel=\"footnote\"\u003E55\u003C\u002Fa\u003E\u003C\u002Fspan\u003E,\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref56\" rel=\"footnote\"\u003E56\u003C\u002Fa\u003E\u003C\u002Fspan\u003E] and stored the surface area of the triangles (in mm\u003Csup\u003E2\u003C\u002Fsup\u003E) in a look-up table. With this look-up table, we then created a surface image (on the above-mentioned raster) that contains zeros at positions whose 8 neighbors are identical, and the local surface area at all positions that have both foreground and background neighbors. These surface images were created for masks \u003Ci\u003EM\u003C\u002Fi\u003E\u003Csub\u003E1\u003C\u002Fsub\u003E and \u003Ci\u003EM\u003C\u002Fi\u003E\u003Csub\u003E2\u003C\u002Fsub\u003E. In addition, we created a distance map from each of these surface images using the distance transform algorithm [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref57\" rel=\"footnote\"\u003E57\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. Iterating over the nonzero elements in the first surface image and looking up the distance from the other surface in the corresponding distance map allows the creation of a list of tuples (surface element area and distance from the other surface). From this list, we can easily compute the overlapping surface area by summing the area of the surface elements that are within the tolerance. 
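A deliberately simplified sketch of the surface DSC computation is given below. It represents each surface by its foreground border voxels with unit areas and uses brute-force nearest-surface distances, so it carries exactly the voxel-based bias discussed above and none of the look-up-table or distance-map optimizations; the authors' open-source implementation remains the reference for the less-biased version.

```python
import numpy as np

def border(mask):
    """Foreground voxels with at least one face neighbor outside the mask
    (a voxel-based surface proxy; the paper's shifted-raster representation
    is less biased)."""
    m = np.asarray(mask, bool)
    p = np.pad(m, 1)
    interior = np.ones_like(m)
    for ax in range(m.ndim):
        lo = [slice(1, -1)] * m.ndim
        hi = [slice(1, -1)] * m.ndim
        lo[ax], hi[ax] = slice(0, -2), slice(2, None)
        interior &= p[tuple(lo)] & p[tuple(hi)]
    return m & ~interior

def surface_dsc(m1, m2, tau, spacing=(1.0, 1.0, 1.0)):
    """Surface Dice at tolerance tau (mm): the fraction of both surfaces
    lying within tau of the other surface. Brute-force O(N^2) distances,
    fine for small volumes; each surface voxel counts with unit area."""
    pts1 = np.argwhere(border(m1)) * np.asarray(spacing)
    pts2 = np.argwhere(border(m2)) * np.asarray(spacing)
    if len(pts1) == 0 or len(pts2) == 0:
        return 1.0 if len(pts1) == len(pts2) else 0.0
    d12 = np.linalg.norm(pts1[:, None, :] - pts2[None, :, :], axis=-1).min(axis=1)
    d21 = np.linalg.norm(pts2[:, None, :] - pts1[None, :, :], axis=-1).min(axis=1)
    return ((d12 <= tau).sum() + (d21 <= tau).sum()) / (len(pts1) + len(pts2))
```

The `spacing` argument mirrors the anisotropic voxel spacing issue raised above: distances are computed in millimeters, not raster units.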
Because the distances between voxels in a 3D raster with spacing (\u003Ci\u003Ed\u003C\u002Fi\u003E\u003Csub\u003E1\u003C\u002Fsub\u003E, \u003Ci\u003Ed\u003C\u002Fi\u003E\u003Csub\u003E2\u003C\u002Fsub\u003E, \u003Ci\u003Ed\u003C\u002Fi\u003E\u003Csub\u003E3\u003C\u002Fsub\u003E) are quantized&#8212;there is only a discrete set \u003Cimg class=\"inline-graphic-image\" alt=\"\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F579a0894d6013d9da94e9a74b700579f.png\" border=\"0\" style=\"width:auto; height:12pt; position:relative; top:3px; background-color: #ffffff;\"\u002F\u003E of possible distances&#8212;we also rounded the tolerance to the nearest element of the set \u003Ci\u003ED\u003C\u002Fi\u003E for each image before computing the surface DSC. Our open-source implementation of surface DSC provides more details.\u003C\u002Fp\u003E\u003Cbr\u002F\u003E\u003Ch3 class=\"navigation-heading h3-main-heading\" id=\"Results\" data-label=\"Results\"\u003EResults\u003C\u002Fh3\u003E\u003Ch4\u003ESelecting Clinically Representative Data Sets\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EData sets are described in detail in the Methods section. In brief, the first data set was a representative sample of CT scans used to plan curative-intent radiotherapy of head and neck cancer for patients at UCLH NHS Foundation Trust, a single high-volume center. We performed iterative cycles of model development using the UCLH scans (\u003Ci\u003Etraining\u003C\u002Fi\u003E and \u003Ci\u003Evalidation\u003C\u002Fi\u003E subsets), taking the performance on a previously unseen subset (\u003Ci\u003Etest\u003C\u002Fi\u003E) as our primary outcome.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EIt is also important to demonstrate a model&#8217;s generalizability to data from previously unseen demographics and distributions. To do this, we curated test and validation data sets of open-source CT scans. 
These were collected from the \u003Ci\u003ETCIA test set\u003C\u002Fi\u003E [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref48\" rel=\"footnote\"\u003E48\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref50\" rel=\"footnote\"\u003E50\u003C\u002Fa\u003E\u003C\u002Fspan\u003E] and the \u003Ci\u003EPDDCA data set\u003C\u002Fi\u003E released as part of the 2015 challenge (\u003Ci\u003EPDDCA test set\u003C\u002Fi\u003E [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref25\" rel=\"footnote\"\u003E25\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]).\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003E\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#table1\" rel=\"footnote\"\u003ETable 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E details the characteristics of these data sets and their patient demographics. Ethnicity and protected-group status are not reported, as this information was not available in the source systems. In total, 21 organs at risk were selected to represent a wide range of anatomical regions throughout the head and neck. To provide a human clinical comparison for the algorithm, each case was manually segmented by a single radiographer with arbitration by a second radiographer. 
This was compared with our study&#8217;s \u003Ci\u003Egold standard\u003C\u002Fi\u003E ground truth graded by 2 other radiographers and arbitrated by one of 2 independent specialist oncologists, each with a minimum of 4 years&#8217; specialist experience in radiotherapy treatment planning for patients with head and neck cancer.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EAn example of model performance is shown in \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#figure7\" rel=\"footnote\"\u003EFigure 7\u003C\u002Fa\u003E\u003C\u002Fspan\u003E; two further randomly selected UCLH set scans are shown in Figures S1 and S2 of \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#app1\" rel=\"footnote\"\u003EMultimedia Appendix 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref19\" rel=\"footnote\"\u003E19\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref31\" rel=\"footnote\"\u003E31\u003C\u002Fa\u003E\u003C\u002Fspan\u003E,\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref34\" rel=\"footnote\"\u003E34\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref46\" rel=\"footnote\"\u003E46\u003C\u002Fa\u003E\u003C\u002Fspan\u003E,\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref56\" rel=\"footnote\"\u003E56\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref90\" rel=\"footnote\"\u003E90\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. 
Three randomly selected TCIA set scans are shown in Figures S3, S4, and S5 of \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#app1\" rel=\"footnote\"\u003EMultimedia Appendix 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E to visually demonstrate the model&#8217;s generalizability. We compared our performance (model vs oncologist) to radiographer performance (radiographer vs oncologist). For more information on data set selection and inclusion and exclusion criteria for patients and organs at risk, see the \u003Ci\u003EMethods\u003C\u002Fi\u003E section.\u003C\u002Fp\u003E\u003Cfigure\u003E\u003Ca name=\"figure7\"\u003E&#8206;\u003C\u002Fa\u003E\u003Ca class=\"fancybox\" title=\"Figure 7. Example results. Computed tomography (CT) image: axial slices at 5 representative levels from the raw CT scan of a male patient aged 55-59 years were selected from the University College London Hospitals data set (patient 20). These were selected to best demonstrate the organs at risk included in the work. The levels shown as 2D slices have been selected to demonstrate all 21 organs at risk included in this study. The window leveling has been adjusted for each to best display the anatomy present. Oncologist contour: the ground truth segmentation, as defined by experienced radiographers and arbitrated by a head and neck specialist oncologist. Model contour: segmentations produced by our model. Contour comparison: contoured by oncologist only (green region) or model only (yellow region). Best viewed on a display. CT: computed tomography.\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fbc23e253cd9b35d2effefc6bad63e7cd.png\" id=\"figure7\"\u003E\u003Cimg class=\"figure-image\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fbc23e253cd9b35d2effefc6bad63e7cd.png\"\u002F\u003E\u003C\u002Fa\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-image\"\u002F\u003EFigure 7. Example results. 
Computed tomography (CT) image: axial slices at 5 representative levels from the raw CT scan of a male patient aged 55-59 years were selected from the University College London Hospitals data set (patient 20). These were selected to best demonstrate the organs at risk included in the work. The levels shown as 2D slices have been selected to demonstrate all 21 organs at risk included in this study. The window leveling has been adjusted for each to best display the anatomy present. Oncologist contour: the ground truth segmentation, as defined by experienced radiographers and arbitrated by a head and neck specialist oncologist. Model contour: segmentations produced by our model. Contour comparison: contoured by oncologist only (green region) or model only (yellow region). Best viewed on a display. CT: computed tomography. \u003C\u002Ffigcaption\u003E\u003Ca class=\"fancybox\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002Fbc23e253cd9b35d2effefc6bad63e7cd.png\" title=\"Figure 7. Example results. Computed tomography (CT) image: axial slices at 5 representative levels from the raw CT scan of a male patient aged 55-59 years were selected from the University College London Hospitals data set (patient 20). These were selected to best demonstrate the organs at risk included in the work. The levels shown as 2D slices have been selected to demonstrate all 21 organs at risk included in this study. The window leveling has been adjusted for each to best display the anatomy present. Oncologist contour: the ground truth segmentation, as defined by experienced radiographers and arbitrated by a head and neck specialist oncologist. Model contour: segmentations produced by our model. Contour comparison: contoured by oncologist only (green region) or model only (yellow region). Best viewed on a display. 
CT: computed tomography.\"\u003EView this figure\u003C\u002Fa\u003E\u003C\u002Ffigure\u003E\u003Ch4\u003EA New Metric for Assessing Clinical Performance\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EIn routine clinical care, algorithm-derived segmentation is reviewed and potentially corrected by a human expert, just as those created by radiographers currently are. Segmentation performance is thus best assessed by determining the fraction of the surface that needs to be redrawn. The standard volumetric DSC [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref91\" rel=\"footnote\"\u003E91\u003C\u002Fa\u003E\u003C\u002Fspan\u003E] is not well suited to this because it weighs all regions of misplaced delineation equally and independently of their distance from the surface. For example, two inaccurate segmentations could have a similar volumetric DSC score if one were to deviate from the correct surface boundary by a small amount in many places, whereas the other had a large deviation at a single point. Correcting the former would likely take a considerable amount of time as it would require redrawing almost all of the boundary, whereas the latter could be corrected much faster, potentially with a single edit action.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EFor quantitative analysis, we therefore introduced a new segmentation performance metric, the \u003Ci\u003Esurface DSC\u003C\u002Fi\u003E (\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#figure8\" rel=\"footnote\"\u003EFigure 8\u003C\u002Fa\u003E\u003C\u002Fspan\u003E), which assesses the overlap of two surfaces (at a specified tolerance) instead of the overlap of two volumes. This provides a measure of agreement between the surfaces of two structures, which is where most of the human effort in correcting is usually expended. 
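This failure mode of the volumetric DSC is easy to reproduce numerically. In the made-up 2D example below (all sizes chosen purely for illustration), a prediction whose entire boundary is off by 1 pixel and a prediction with a single large protrusion receive identical volumetric DSC scores, even though correcting them requires very different amounts of effort.

```python
import numpy as np

def dsc(m1, m2):
    """Standard volumetric Dice similarity coefficient for binary masks."""
    m1, m2 = m1.astype(bool), m2.astype(bool)
    return 2.0 * np.logical_and(m1, m2).sum() / (m1.sum() + m2.sum())

gt = np.zeros((50, 50), bool)
gt[15:35, 15:35] = True              # 20x20 ground truth square (400 px)

# Prediction A: boundary off by 1 px on every side -> almost the entire
# contour would need redrawing.
pred_a = np.zeros_like(gt)
pred_a[14:36, 14:36] = True          # 22x22 square (484 px)

# Prediction B: correct everywhere except one 6x14 protrusion (84 px) ->
# a single local edit would fix it.
pred_b = gt.copy()
pred_b[15:21, 35:49] = True          # also 484 px in total

# Same intersection (400) and same size (484) -> identical volumetric DSC.
assert abs(dsc(gt, pred_a) - dsc(gt, pred_b)) < 1e-12
```

A surface-based metric with a small tolerance separates the two cases: nearly the whole boundary of prediction A is out of tolerance, whereas only the protrusion of prediction B is.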
In doing so, we also addressed the volumetric DSC&#8217;s bias toward large organs at risk, where the large (and mostly trivial) internal volume accounts for a much larger proportion of the score.\u003C\u002Fp\u003E\u003Cfigure\u003E\u003Ca name=\"figure8\"\u003E&#8206;\u003C\u002Fa\u003E\u003Ca class=\"fancybox\" title=\"Figure 8. Surface Dice similarity coefficient performance metric. (a) Illustration of the computation of the surface Dice similarity coefficient. Continuous line: predicted surface. Dashed line: ground truth surface. Black arrow: the maximum margin of deviation that may be tolerated without penalty, hereafter referred to by &#964;. Note that in our use case each organ at risk has an independently calculated value for &#964;. Green: acceptable surface parts (distance between surfaces &#8804;&#964;). Pink: unacceptable regions of the surfaces (distance between surfaces &gt;&#964;). The proposed surface Dice similarity coefficient metric reports the good surface parts compared with the total surface (sum of predicted surface area and ground truth surface area). (b) Illustration of the determination of the organ-specific tolerance. Green: segmentation of an organ by oncologist A. Black: segmentation by oncologist B. Red: distances between the surfaces.\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F22d8f0548a88dcb5fd0e3bf1929b02c8.png\" id=\"figure8\"\u003E\u003Cimg class=\"figure-image\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F22d8f0548a88dcb5fd0e3bf1929b02c8.png\"\u002F\u003E\u003C\u002Fa\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-image\"\u002F\u003EFigure 8. Surface Dice similarity coefficient performance metric. (a) Illustration of the computation of the surface Dice similarity coefficient. Continuous line: predicted surface. Dashed line: ground truth surface. Black arrow: the maximum margin of deviation that may be tolerated without penalty, hereafter referred to by &#964;. 
Note that in our use case each organ at risk has an independently calculated value for &#964;. Green: acceptable surface parts (distance between surfaces &#8804;&#964;). Pink: unacceptable regions of the surfaces (distance between surfaces &gt;&#964;). The proposed surface Dice similarity coefficient metric reports the good surface parts compared with the total surface (sum of predicted surface area and ground truth surface area). (b) Illustration of the determination of the organ-specific tolerance. Green: segmentation of an organ by oncologist A. Black: segmentation by oncologist B. Red: distances between the surfaces. \u003C\u002Ffigcaption\u003E\u003Ca class=\"fancybox\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F22d8f0548a88dcb5fd0e3bf1929b02c8.png\" title=\"Figure 8. Surface Dice similarity coefficient performance metric. (a) Illustration of the computation of the surface Dice similarity coefficient. Continuous line: predicted surface. Dashed line: ground truth surface. Black arrow: the maximum margin of deviation that may be tolerated without penalty, hereafter referred to by &#964;. Note that in our use case each organ at risk has an independently calculated value for &#964;. Green: acceptable surface parts (distance between surfaces &#8804;&#964;). Pink: unacceptable regions of the surfaces (distance between surfaces &gt;&#964;). The proposed surface Dice similarity coefficient metric reports the good surface parts compared with the total surface (sum of predicted surface area and ground truth surface area). (b) Illustration of the determination of the organ-specific tolerance. Green: segmentation of an organ by oncologist A. Black: segmentation by oncologist B. Red: distances between the surfaces.\"\u003EView this figure\u003C\u002Fa\u003E\u003C\u002Ffigure\u003E\u003Cp class=\"abstract-paragraph\"\u003EWhen evaluating the surface DSC, we must define a threshold within which the variation is clinically acceptable. 
To do this, we first defined the organ-specific tolerances (in mm) as a parameter of the proposed metric, &#964;. We computed these acceptable tolerances for each organ by measuring the interobserver variation in segmentations between 3 different consultant oncologists (each with over 10 years of experience in organ at risk delineation) on the validation subset of TCIA images.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003ETo penalize both false-negative and false-positive parts of the predicted surface, our proposed metric measures both nonsymmetric distances between the surfaces and then normalizes them by the combined surface area. Similar to volumetric DSC, the surface DSC ranges from 0 (no overlap) to 1 (perfect overlap).\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EThere is no consensus as to what constitutes a nonsignificant variation in such a segmentation. We therefore selected a surface DSC of 0.95, a stringency that likely far exceeds the expert oncologist intrarater concordance [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref19\" rel=\"footnote\"\u003E19\u003C\u002Fa\u003E\u003C\u002Fspan\u003E,\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref92\" rel=\"footnote\"\u003E92\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. At this threshold, approximately 95% of the surface must be properly outlined (ie, within &#964; mm of the correct boundary), whereas 5% needs to be corrected. 
For a more formal definition and implementation, see the \u003Ci\u003EMethods\u003C\u002Fi\u003E section.\u003C\u002Fp\u003E\u003Ch4\u003EModel Performance\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EModel performance was evaluated alongside that of therapeutic radiographers (each with at least 4 years of experience) segmenting the test set of UCLH images independently of the oncologist-reviewed scans (which we used as our ground truth).\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EThe model performed similarly to humans. For all organs at risk studied, there was no clinically meaningful difference between the deep learning model&#8217;s segmentations and those of the radiographers (\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#figure9\" rel=\"footnote\"\u003EFigure 9\u003C\u002Fa\u003E\u003C\u002Fspan\u003E and Tables S1 and S2, \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#app1\" rel=\"footnote\"\u003EMultimedia Appendix 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E). For details on the number of labeled scans in the UCLH test set, see Table S3 in \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#app1\" rel=\"footnote\"\u003EMultimedia Appendix 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003ETo investigate the generalizability of our model, we additionally evaluated its performance on open-source scans (\u003Ci\u003ETCIA test set\u003C\u002Fi\u003E). These were collected from sites in the United States, where patient demographics, clinical pathways for radiotherapy, and scanner type and parameters differed from our UK training set in meaningful ways. 
Nevertheless, model performance was preserved: in 90% (19\u002F21) of organs at risk, the model performed within the threshold defined for human variability (\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#figure10\" rel=\"footnote\"\u003EFigure 10\u003C\u002Fa\u003E\u003C\u002Fspan\u003E). The lower performance in 2 organs at risk (brainstem and right lens) compared with the UK data may relate to issues of image quality in several TCIA test set scans.\u003C\u002Fp\u003E\u003Cfigure\u003E\u003Ca name=\"figure9\"\u003E&#8206;\u003C\u002Fa\u003E\u003Ca class=\"fancybox\" title=\"Figure 9. University College London Hospitals (UCLH) test set: quantitative performance of the model in comparison with radiographers. (a) The model achieves a surface Dice similarity coefficient similar to humans in all 21 organs at risk (on the UCLH held out test set) when compared with the gold standard for each organ at an organ-specific tolerance &#964;. Blue: our model; green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate the most extreme nonoutlier data points; data lying more than 1.5 IQRs beyond the quartiles are shown as circular fliers. The notches represent the 95% CI around the median. DSC: Dice similarity coefficient; UCLH: University College London Hospitals.\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F575e67d4074029b6161117fe32c21a66.png\" id=\"figure9\"\u003E\u003Cimg class=\"figure-image\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F575e67d4074029b6161117fe32c21a66.png\"\u002F\u003E\u003C\u002Fa\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-image\"\u002F\u003EFigure 9. 
University College London Hospitals (UCLH) test set: quantitative performance of the model in comparison with radiographers. (a) The model achieves a surface Dice similarity coefficient similar to humans in all 21 organs at risk (on the UCLH held out test set) when compared with the gold standard for each organ at an organ-specific tolerance &#964;. Blue: our model; green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate most extreme, nonoutlier data points. Where data lie outside, an IQR of 1.5 is represented as a circular flier. The notches represent the 95% CI around the median. DSC: Dice similarity coefficient; UCLH: University College London Hospitals. \u003C\u002Ffigcaption\u003E\u003Ca class=\"fancybox\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F575e67d4074029b6161117fe32c21a66.png\" title=\"Figure 9. University College London Hospitals (UCLH) test set: quantitative performance of the model in comparison with radiographers. (a) The model achieves a surface Dice similarity coefficient similar to humans in all 21 organs at risk (on the UCLH held out test set) when compared with the gold standard for each organ at an organ-specific tolerance &#964;. Blue: our model; green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate most extreme, nonoutlier data points. Where data lie outside, an IQR of 1.5 is represented as a circular flier. The notches represent the 95% CI around the median. 
DSC: Dice similarity coefficient; UCLH: University College London Hospitals.\"\u003EView this figure\u003C\u002Fa\u003E\u003C\u002Ffigure\u003E\u003Cfigure\u003E\u003Ca name=\"figure10\"\u003E&#8206;\u003C\u002Fa\u003E\u003Ca class=\"fancybox\" title=\"Figure 10. Model generalizability to an independent test set from The Cancer Imaging Archive (TCIA). Quantitative performance of the model on TCIA test set in comparison with radiographers. (a) Surface Dice similarity coefficient (on the TCIA open-source test set) for the segmentations compared with the gold standard for each organ at an organ-specific tolerance &#964;. Blue: our model, green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. Red lines show the mean difference. The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate most extreme, nonoutlier data points. Where data lie outside, an IQR of 1.5 is represented as a circular flier. The notches represent the 95% CI around the median. DSC: Dice similarity coefficient; TCIA: The Cancer Imaging Archive.\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F695687440a4e5a83918929c8810032a5.png\" id=\"figure10\"\u003E\u003Cimg class=\"figure-image\" src=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F695687440a4e5a83918929c8810032a5.png\"\u002F\u003E\u003C\u002Fa\u003E\u003Cfigcaption\u003E\u003Cspan class=\"typcn typcn-image\"\u002F\u003EFigure 10. Model generalizability to an independent test set from The Cancer Imaging Archive (TCIA). Quantitative performance of the model on TCIA test set in comparison with radiographers. (a) Surface Dice similarity coefficient (on the TCIA open-source test set) for the segmentations compared with the gold standard for each organ at an organ-specific tolerance &#964;. 
Blue: our model, green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. Red lines show the mean difference. The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate most extreme, nonoutlier data points. Where data lie outside, an IQR of 1.5 is represented as a circular flier. The notches represent the 95% CI around the median. DSC: Dice similarity coefficient; TCIA: The Cancer Imaging Archive. \u003C\u002Ffigcaption\u003E\u003Ca class=\"fancybox\" href=\"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F695687440a4e5a83918929c8810032a5.png\" title=\"Figure 10. Model generalizability to an independent test set from The Cancer Imaging Archive (TCIA). Quantitative performance of the model on TCIA test set in comparison with radiographers. (a) Surface Dice similarity coefficient (on the TCIA open-source test set) for the segmentations compared with the gold standard for each organ at an organ-specific tolerance &#964;. Blue: our model, green: radiographers. (b) Performance difference between the model and the radiographers. Each blue dot represents a model-radiographer pair. Red lines show the mean difference. The gray area highlights nonsubstantial differences (&#8722;5% to +5%). The box extends from the lower to upper quartile values of the data, with a line at the median. The whiskers indicate most extreme, nonoutlier data points. Where data lie outside, an IQR of 1.5 is represented as a circular flier. The notches represent the 95% CI around the median. 
DSC: Dice similarity coefficient; TCIA: The Cancer Imaging Archive.\"\u003EView this figure\u003C\u002Fa\u003E\u003C\u002Ffigure\u003E\u003Cp class=\"abstract-paragraph\"\u003EFor more detailed results demonstrating surface DSC and volumetric DSC for each individual patient from the TCIA test set, see Table S4 and Table S5, respectively, in \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#app1\" rel=\"footnote\"\u003EMultimedia Appendix 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E.\u003C\u002Fp\u003E\u003Ch4\u003EComparison With Previous Work\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EAn accurate quantitative comparison with previously published literature is difficult because of inherent differences in definitions of ground truth segmentations and varied processes of arbitration and consensus building. Given that the use of surface DSC is novel in this study, we also reported the standard volumetric DSC scores achieved by our algorithm (despite the shortcomings of this method) so that our results can be directly compared with those in the existing literature. An overview of past papers that have reported mean volumetric DSC for unedited automatic delineation of head and neck CT organs at risk can be found in Table S6, \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#app1\" rel=\"footnote\"\u003EMultimedia Appendix 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E. Each used different data sets, scanning parameters, and labeling protocols, meaning that the resulting volumetric DSC results varied significantly. No study, other than ours, segmented the lacrimal glands. 
We compared these results with those obtained when we applied our model to three different data sets: the TCIA open-source test set, an additional test set from the original UCLH data set (\u003Ci\u003EUCLH test set\u003C\u002Fi\u003E), and the data set released by the PDDCA as part of the 2015 Medical Image Computing and Computer Assisted Intervention head and neck radiotherapy organ at risk segmentation challenge (\u003Ci\u003EPDDCA test set\u003C\u002Fi\u003E [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref25\" rel=\"footnote\"\u003E25\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]). To contextualize the performance of our model, radiographer performance is shown on the TCIA test set, and oncologist interobserver variation is shown on the UCLH test set.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EAlthough it was not the primary test set, we nevertheless present per-patient surface DSC and volumetric DSC for the PDDCA test set in Table S7 and Table S8 in \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#app1\" rel=\"footnote\"\u003EMultimedia Appendix 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E, respectively.\u003C\u002Fp\u003E\u003Cbr\u002F\u003E\u003Ch3 class=\"navigation-heading h3-main-heading\" id=\"Discussion\" data-label=\"Discussion\"\u003EDiscussion\u003C\u002Fh3\u003E\u003Ch4\u003EPrincipal Findings\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EWe demonstrated an automated deep learning&#8211;based segmentation algorithm that can perform as well as experienced radiographers for head and neck radiotherapy planning. 
Our model was developed using CT scans derived from routine clinical practice and therefore should be applicable in a hospital setting for the segmentation of organs at risk, routine radiation therapy quality assurance peer review, and in reducing the associated variability between different specialists and radiotherapy centers [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref93\" rel=\"footnote\"\u003E93\u003C\u002Fa\u003E\u003C\u002Fspan\u003E].\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EClinical applicability must be supported not only by high model performance but also by evidence of model generalizability to new external data sets. To achieve this, we presented these results on three separate test sets, one of which (the PDDCA test set) uses a different segmentation protocol. In this study, performance in most organs at risk was maintained when tested on scans taken from a range of previously unseen international sites. Although these scans varied in patient demographics, scanning protocol, device manufacturer, and image quality, the model still achieved human performance on 19 of the 21 organs at risk studied; only the right lens and brainstem were below radiographer performance. For these organs at risk, the performance of the model might have been lower than expert performance owing to lower image quality. This is particularly evident for the right lens, where the anatomical borders were quite indistinct in some TCIA test set cases, thus preventing full segmentation by the model (Figure S6, \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#app1\" rel=\"footnote\"\u003EMultimedia Appendix 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E). Moreover, a precise CT definition of the brainstem&#8217;s proximal and distal boundaries is lacking, a factor that might have contributed to labeling variability and thus to decreased model performance. 
Finally, demographic bias may have resulted from the TCIA data set selection for cases of more advanced head and neck cancer [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref48\" rel=\"footnote\"\u003E48\u003C\u002Fa\u003E\u003C\u002Fspan\u003E] or from variability in the training data [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref10\" rel=\"footnote\"\u003E10\u003C\u002Fa\u003E\u003C\u002Fspan\u003E].\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EOne major contribution of this paper is the presentation of a performance measure that represents the clinical task of organ at risk correction. In the first preprint of this work, we introduced surface DSC [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref70\" rel=\"footnote\"\u003E70\u003C\u002Fa\u003E\u003C\u002Fspan\u003E], a metric conceived to be sensitive to clinically significant errors in organ at risk delineation. Surface DSC has recently been shown to be more strongly correlated with the amount of time required to correct segmentation for clinical use than traditional metrics, including volumetric DSC [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref94\" rel=\"footnote\"\u003E94\u003C\u002Fa\u003E\u003C\u002Fspan\u003E,\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref95\" rel=\"footnote\"\u003E95\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. Small deviations in organ at risk border placement can have a potentially serious impact, increasing the risk of debilitating side effects for the patient. Misplacement by only a small offset may thus require the entire region to be redrawn, and in such cases, an automated segmentation algorithm may offer no time savings. Volumetric DSC is relatively insensitive to such small changes in large organs, as the absolute overlap is also large. 
Difficulties identifying the exact borders of smaller organs can result in large differences in volumetric DSC, even if these differences are not clinically relevant in terms of their effect on radiotherapy treatment. By strongly penalizing border placement outside a tolerance determined by consultant oncologists, the surface DSC metric resolves these issues.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EAlthough volumetric DSC is therefore not representative of clinical consequences, it remains the most popular metric for evaluating segmentation models and therefore the only metric that allows comparison with previously published works. In recent years, fully convolutional networks have become the most popular and successful methodology for organ at risk segmentation in head and neck CT for de novo radiotherapy planning [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref40\" rel=\"footnote\"\u003E40\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref45\" rel=\"footnote\"\u003E45\u003C\u002Fa\u003E\u003C\u002Fspan\u003E,\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref58\" rel=\"footnote\"\u003E58\u003C\u002Fa\u003E\u003C\u002Fspan\u003E-\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref69\" rel=\"footnote\"\u003E69\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. Although not directly comparable owing to different data sets and labeling protocols, our volumetric DSC results compare favorably with the existing published literature for many of the organs at risk (see Table S6 and Figure S7, \u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#app1\" rel=\"footnote\"\u003EMultimedia Appendix 1\u003C\u002Fa\u003E\u003C\u002Fspan\u003E, for more details on this and other prior publications). 
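To make the contrast between the two metrics concrete, the following is a minimal illustrative sketch, not the released surface DSC implementation (which operates on surface elements extracted from the masks rather than on raw point clouds); the function names, toy masks, and tolerance value are hypothetical:

```python
import numpy as np

def volumetric_dice(mask_a, mask_b):
    """Standard volumetric Dice: 2|A n B| / (|A| + |B|) on boolean masks."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    total = mask_a.sum() + mask_b.sum()
    return 2.0 * intersection / total if total else 1.0

def surface_dice(points_a, points_b, tau):
    """Surface Dice at tolerance tau, approximated on surface point clouds.

    points_a, points_b: (N, 3) arrays of surface points (in mm). Counts the
    fraction of each surface lying within tau of the other, so border
    misplacements beyond tau are penalized regardless of organ volume.
    """
    # Pairwise distances between the two surfaces (brute force for clarity).
    d = np.linalg.norm(points_a[:, None, :] - points_b[None, :, :], axis=-1)
    within = (d.min(axis=1) <= tau).sum() + (d.min(axis=0) <= tau).sum()
    return within / (len(points_a) + len(points_b))

# A 1-voxel border shift barely changes volumetric Dice for a large organ
# but sharply lowers it for a small one; surface Dice instead measures
# border deviation directly against the tolerance tau.
large = np.zeros((60, 60, 60), dtype=bool)
large[5:55, 5:55, 5:55] = True
small = np.zeros((60, 60, 60), dtype=bool)
small[5:9, 5:9, 5:9] = True
print(volumetric_dice(large, np.roll(large, 1, axis=0)))  # 0.98
print(volumetric_dice(small, np.roll(small, 1, axis=0)))  # 0.75
```

The toy example illustrates the volume dependence discussed above: the same 1-voxel misplacement costs a 50-voxel cube 2% of its volumetric Dice but a 4-voxel cube 25%.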
In organs at risk where our volumetric DSC scores were inferior to those in the published literature, both our model and the human radiographers achieved similar scores. This suggests that current and previously published results are difficult to compare, either because of the inclusion of more difficult cases than in previous studies or because of different segmentation and scanning protocols. To allow more objective comparisons of different segmentation methods, we made our labeled TCIA data sets freely available to the academic community (see the Acknowledgments section on data availability). At least 11 auto-segmentation software solutions are currently available commercially, with varying claims regarding their potential to lower segmentation time during radiotherapy planning [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref96\" rel=\"footnote\"\u003E96\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. The principal factor that determines whether automatic segmentation is time saving during the radiotherapy workflow is the degree to which automated segmentations require correction by oncologists.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EThe wide variability in the state of the art and the limited uptake in routine clinical practice motivate the need for clinical studies evaluating model performance in practice. Future work will seek to define the clinical acceptability of the segmented organs at risk produced by our models and estimate the time saving that could be achieved during the radiotherapy planning workflow in a real-world setting.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003ESeveral other limitations should be addressed in future studies. First, we included only planning CT scans because magnetic resonance imaging and positron emission tomography scans were not routinely performed for all patients in the UCLH data set. 
Some organ at risk classes, such as the optic chiasm, require co-registration with magnetic resonance images for optimal delineation, and access to additional imaging has been shown to improve the delineation of optic nerves [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref29\" rel=\"footnote\"\u003E29\u003C\u002Fa\u003E\u003C\u002Fspan\u003E]. As a result, certain organ at risk classes were deliberately excluded from this CT-based project and will be addressed in future work that will incorporate magnetic resonance imaging scans. A second limitation concerns the classes of organs at risk included in this study. Although we presented one of the largest sets of reported organs at risk in the literature [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref44\" rel=\"footnote\"\u003E44\u003C\u002Fa\u003E\u003C\u002Fspan\u003E,\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref97\" rel=\"footnote\"\u003E97\u003C\u002Fa\u003E\u003C\u002Fspan\u003E,\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref98\" rel=\"footnote\"\u003E98\u003C\u002Fa\u003E\u003C\u002Fspan\u003E], some omissions occurred (eg, oral cavity) owing to an insufficient number of examples in the training data that conformed to a standard international protocol. The number of oncologists used in the creation of our ground truth may not have fully captured the variability in organ at risk segmentation or may have been biased toward a particular interpretation of the Brouwer Atlas used as our segmentation protocol. Even in an organ as simple as the spinal cord, which is traditionally outlined reliably by auto-segmentation algorithms, there is ambiguity regarding the inclusion of, for example, the nerve roots. Such variation may widen the thresholds of acceptable deviation in favor of the model, despite a consistent protocol. 
Future studies will address these deficits alongside time-consuming lymph node segmentation.\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EFinally, neither of the test sets used in this study included the patients&#8217; protected-characteristic status. This is a significant limitation, as it prevents the study of intersectional fairness.\u003C\u002Fp\u003E\u003Ch4\u003EConclusions\u003C\u002Fh4\u003E\u003Cp class=\"abstract-paragraph\"\u003EIn conclusion, we demonstrated that deep learning can achieve human expert&#8211;level performance in the segmentation of head and neck organs at risk in radiotherapy planning CT scans, using a clinically applicable performance metric designed for this clinical scenario. We provided evidence of the generalizability of this model by testing it on patients from different geographies, demographics, and scanning protocols. This segmentation algorithm performed with accuracy similar to that of experts and has the potential to improve the speed, efficiency, and consistency of radiotherapy workflows, with an expected positive influence on patient outcomes. Future work will investigate the impact of our segmentation algorithm in clinical practice.\u003C\u002Fp\u003E\u003C\u002Farticle\u003E\u003Cp\u003E\u003Ch4 class=\"h4-border-top\"\u003EAcknowledgments\u003C\u002Fh4\u003E\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EThe codebase for the deep learning framework makes use of proprietary components, and we are unable to publicly release this code. However, all experiments and implementation details are described in detail in the Methods section to allow independent replication with nonproprietary libraries. 
The surface DSC performance metric code is available on the internet [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref99\" rel=\"footnote\"\u003E99\u003C\u002Fa\u003E\u003C\u002Fspan\u003E].\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EThe clinical data used for training and validation sets were collected and deidentified at the UCLH NHS Foundation Trust. The data were used under both local and national permissions. They are not publicly available, and restrictions apply to their use. The data, or a subset, may be available from the UCLH NHS Foundation Trust, subject to local and national ethical approvals. The released test or validation set data were collected from two data sets hosted on TCIA. The subset used, along with the ground truth segmentations added, is available on the internet [\u003Cspan class=\"footers\"\u003E\u003Ca class=\"citation-link\" href=\"#ref100\" rel=\"footnote\"\u003E100\u003C\u002Fa\u003E\u003C\u002Fspan\u003E].\u003C\u002Fp\u003E\u003Cp class=\"abstract-paragraph\"\u003EThe authors thank the patients treated at UCLH whose scans were used in this work, A Zisserman, D King, D Barrett, V Cornelius, C Beltran, J Cornebise, R Sharma, J Ashburner, J Good, and N Haji for discussions, M Kosmin for his review of the published literature, J Adler for discussion and review of the manuscript, A Warry, U Johnson, V Rompokos, and the rest of the UCLH Radiotherapy Physics team for work on the data collection, R West for work on the visuals, C Game, D Mitchell, and M Johnson for infrastructure and systems administration, A Paine at Softwire for engineering support at UCLH, A Kitchener and the UCLH Information Governance team for support, J Besley and M Bawn for legal assistance, K Ayoub, K Sullivan, and R Ahmed for initiating and supporting the collaboration, the DeepMind Radiographer Consortium made up of B Garie, Y McQuinlan, K Hampton, S Ireland, K Fuller, H Frank, C Tully, A Jones, and L Turner, and the 
rest of the DeepMind team for their support, ideas, and encouragement. GR and HM were supported by University College London and the National Institute for Health Research UCLH Biomedical Research Centre. The views expressed are those of the authors and not necessarily those of the NHS, the National Institute for Health Research, or the Department of Health.\u003C\u002Fp\u003E\u003Ch4 class=\"h4-border-top\"\u003EAuthors' Contributions\u003C\u002Fh4\u003E\u003Cp\u003E\u003Cp class=\"abstract-paragraph\"\u003EMS, TB, OR, JRL, RM, HM, SAM, DD, CC, and COH initiated the project. SB, RM, DC, CB, DD, CC, and JRL created the data sets. SB, SN, JDF, AZ, YP, COH, HA, and OR contributed to software engineering. SN, JDF, BRP, and OR designed the model architectures. BG, YMQ, SI, KH, and KF manually segmented the images. RM, DC, CB, DD, SAM, HM, GR, COH, AK, and JRL contributed clinical expertise. CM, JRL, TB, SAM, KS, and OR managed the project. COH, CK, ML, JRL, SN, SB, JDF, HM, GR, and OR wrote the paper.\u003C\u002Fp\u003E\u003C\u002Fp\u003E\u003Ch4 class=\"h4-border-top\"\u003EConflicts of Interest\u003C\u002Fh4\u003E\u003Cp\u003E\u003Cp class=\"abstract-paragraph\"\u003EGR, HM, CK, COH, and DC were paid contractors of DeepMind and Google Health.\u003C\u002Fp\u003E\u003C\u002Fp\u003E\n &#8206;\n \u003Cdiv id=\"app1\" name=\"app1\"\u003EMultimedia Appendix 1\u003Cp class=\"abstract-paragraph\"\u003EAdditional Tables S1-S8 and Figures S1-S7 show further visual examples of model outputs, performance metrics, and detailed comparisons to previously published works.\u003C\u002Fp\u003E\u003Ca href=\"https:\u002F\u002Fjmir.org\u002Fapi\u002Fdownload?alt_name=jmir_v23i7e26151_app1.pdf&amp;filename=584abec9f29d69baaab930a03ecb2c2d.pdf\" target=\"_blank\"\u003EPDF File (Adobe PDF File), 10937 KB\u003C\u002Fa\u003E\u003C\u002Fdiv\u003E\u003Chr\u002F\u003E\u003Cdiv class=\"footnotes\"\u003E\u003Ch4 class=\"h4-border-top\" 
id=\"References\"\u003EReferences\u003C\u002Fh4\u003E\u003Col\u003E\u003Cli\u003E\u003Cspan id=\"ref1\"\u003EJemal A, Bray F, Center MM, Ferlay J, Ward E, Forman D. Global cancer statistics. CA Cancer J Clin 2011;61(2):69-90 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.3322\u002Fcaac.20107\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.3322\u002Fcaac.20107\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=21296855&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref2\"\u003EHead and neck cancers incidence statistics. Cancer Research UK. \n &#160; URL: \u003Ca target=\"_blank\" href=\"https:\u002F\u002Fwww.cancerresearchuk.org\u002Fhealth-professional\u002Fcancer-statistics\u002Fstatistics-by-cancer-type\u002Fhead-and-neck-cancers\u002Fincidence#heading-Two\"\u003Ehttps:\u002F&#8203;\u002Fwww.&#8203;cancerresearchuk.org\u002F&#8203;health-professional\u002F&#8203;cancer-statistics\u002F&#8203;statistics-by-cancer-type\u002F&#8203;head-and-neck-cancers\u002F&#8203;incidence#heading-Two\u003C\u002Fa\u003E [accessed 2018-02-08]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref3\"\u003ENCIN data briefing: potentially HPV-related head and neck cancers. National Cancer Intelligence Network. 
\n &#160; URL: \u003Ca target=\"_blank\" href=\"http:\u002F\u002Fwww.ncin.org.uk\u002Fpublications\u002Fdata_briefings\u002Fpotentially_hpv_related_head_and_neck_cancers\"\u003Ehttp:\u002F\u002Fwww.ncin.org.uk\u002Fpublications\u002Fdata_briefings\u002Fpotentially_hpv_related_head_and_neck_cancers\u003C\u002Fa\u003E [accessed 2021-05-17]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref4\"\u003EProfile of head and neck cancers in England: incidence, mortality and survival. Oxford Cancer Intelligence Unit. 2010. \n &#160; URL: \u003Ca target=\"_blank\" href=\"http:\u002F\u002Fwww.ncin.org.uk\u002Fview?rid=69\"\u003Ehttp:\u002F\u002Fwww.ncin.org.uk\u002Fview?rid=69\u003C\u002Fa\u003E [accessed 2021-05-17]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref5\"\u003EParkin DM, Boyd L, Walker LC. 16. The fraction of cancer attributable to lifestyle and environmental factors in the UK in 2010. Br J Cancer 2011 Dec 06;105 Suppl 2:77-81 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F22158327\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1038\u002Fbjc.2011.489\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=22158327&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref6\"\u003EJensen K, Lambertsen K, Grau C. Late swallowing dysfunction and dysphagia after radiotherapy for pharynx cancer: frequency, intensity and correlation with dose and volume parameters. Radiother Oncol 2007 Oct;85(1):74-82. 
[\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.radonc.2007.06.004\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=17673322&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref7\"\u003EDirix P, Abbeel S, Vanstraelen B, Hermans R, Nuyts S. Dysphagia after chemoradiotherapy for head-and-neck squamous cell carcinoma: dose-effect relationships for the swallowing structures. Int J Radiat Oncol Biol Phys 2009 Oct 01;75(2):385-392 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1016\u002Fj.ijrobp.2008.11.041\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.ijrobp.2008.11.041\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=19553033&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref8\"\u003ECaudell JJ, Schaner PE, Desmond RA, Meredith RF, Spencer SA, Bonner JA. Dosimetric factors associated with long-term dysphagia after definitive radiotherapy for squamous cell carcinoma of the head and neck. Int J Radiat Oncol Biol Phys 2010 Feb 01;76(2):403-409. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.ijrobp.2009.02.017\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=19467801&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref9\"\u003ENutting CM, Morden JP, Harrington KJ, Urbano TG, Bhide SA, Clark C, et al. 
Parotid-sparing intensity modulated versus conventional radiotherapy in head and neck cancer (PARSPORT): a phase 3 multicentre randomised controlled trial. Lancet Oncol 2011 Feb;12(2):127-136. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fs1470-2045(10)70290-4\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref10\"\u003ENelms BE, Tom&#233; WA, Robinson G, Wheeler J. Variations in the contouring of organs at risk: test case from a patient with oropharyngeal cancer. Int J Radiat Oncol Biol Phys 2012 Jan 01;82(1):368-378. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.ijrobp.2010.10.019\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=21123004&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref11\"\u003EVoet PW, Dirkx ML, Teguh DN, Hoogeman MS, Levendag PC, Heijmen BJ. Does atlas-based autosegmentation of neck levels require subsequent manual contour editing to avoid risk of severe target underdosage? A dosimetric analysis. Radiother Oncol 2011 Mar;98(3):373-377. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.radonc.2010.11.017\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=21269714&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref12\"\u003EHarari PM, Song S, Tom&#233; WA. Emphasizing conformal avoidance versus target definition for IMRT planning in head-and-neck cancer. 
Int J Radiat Oncol Biol Phys 2010 Jul 01;77(3):950-958 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F20378266\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.ijrobp.2009.09.062\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=20378266&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref13\"\u003EChen Z, King W, Pearcey R, Kerba M, Mackillop WJ. The relationship between waiting time for radiotherapy and clinical outcomes: a systematic review of the literature. Radiother Oncol 2008 Apr;87(1):3-16. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.radonc.2007.11.016\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=18160158&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref14\"\u003EMikeljevic JS, Haward R, Johnston C, Crellin A, Dodwell D, Jones A, et al. Trends in postoperative radiotherapy delay and the effect on survival in breast cancer patients treated with conservation surgery. 
Br J Cancer 2004 Apr 05;90(7):1343-1348 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F15054452\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1038\u002Fsj.bjc.6601693\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=15054452&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref15\"\u003EThe NHS Cancer Plan and the new NHS, Chapter 5. National Health Service. 2004. \n &#160; URL: \u003Ca target=\"_blank\" href=\"http:\u002F\u002Fwww.wales.nhs.uk\u002Ftechnologymls\u002Fenglish\u002Fresources\u002Fpdf\u002Fcancer_nsf.pdf\"\u003Ehttp:\u002F\u002Fwww.wales.nhs.uk\u002Ftechnologymls\u002Fenglish\u002Fresources\u002Fpdf\u002Fcancer_nsf.pdf\u003C\u002Fa\u003E [accessed 2021-05-17]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref16\"\u003ERound C, Williams M, Mee T, Kirkby N, Cooper T, Hoskin P, et al. Radiotherapy demand and activity in England 2006-2020. Clin Oncol (R Coll Radiol) 2013 Sep;25(9):522-530. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.clon.2013.05.005\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=23768454&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref17\"\u003ERosenblatt E, Zubizarreta E. Radiotherapy in cancer care: facing the global challenge. International Atomic Energy Agency. 2017. 
\n &#160; URL: \u003Ca target=\"_blank\" href=\"https:\u002F\u002Fwww-pub.iaea.org\u002FMTCD\u002FPublications\u002FPDF\u002FP1638_web.pdf\"\u003Ehttps:\u002F\u002Fwww-pub.iaea.org\u002FMTCD\u002FPublications\u002FPDF\u002FP1638_web.pdf\u003C\u002Fa\u003E [accessed 2021-05-17]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref18\"\u003EVeiga C, McClelland J, Moinuddin S, Louren&#231;o A, Ricketts K, Annkah J, et al. Toward adaptive radiotherapy for head and neck patients: feasibility study on using CT-to-CBCT deformable registration for \"dose of the day\" calculations. Med Phys 2014 Mar 19;41(3):031703 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1118\u002F1.4864240\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1118\u002F1.4864240\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=24593707&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref19\"\u003EDaisne J, Blumhofer A. Atlas-based automatic segmentation of head and neck organs at risk and nodal target volumes: a clinical validation. Radiat Oncol 2013 Jun 26;8(1):154. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1186\u002F1748-717x-8-154\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref20\"\u003EFortunati V, Verhaart RF, van der Lijn F, Niessen WJ, Veenland JF, Paulides MM, et al. Tissue segmentation of head and neck CT images for treatment planning: a multiatlas approach combined with intensity modeling. Med Phys 2013 Jul 20;40(7):071905. 
[\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1118\u002F1.4810971\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=23822442&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref21\"\u003EHoang Duc AK, Eminowicz G, Mendes R, Wong S, McClelland J, Modat M, et al. Validation of clinical acceptability of an atlas-based segmentation algorithm for the delineation of organs at risk in head and neck cancer. Med Phys 2015 Sep 05;42(9):5027-5034 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1118\u002F1.4927567\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1118\u002F1.4927567\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=26328953&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref22\"\u003EThomson D, Boylan C, Liptrot T, Aitkenhead A, Lee L, Yap B, et al. Evaluation of an automatic segmentation algorithm for definition of head and neck organs at risk. Radiat Oncol 2014;9(1):173. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1186\u002F1748-717x-9-173\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref23\"\u003EWalker GV, Awan M, Tao R, Koay EJ, Boehling NS, Grant JD, et al. Prospective randomized double-blind study of atlas-based organ-at-risk autosegmentation-assisted radiation planning in head and neck cancer. 
Radiother Oncol 2014 Sep;112(3):321-325 [\u003Ca href=\"https:\u002F\u002Flinkinghub.elsevier.com\u002Fretrieve\u002Fpii\u002FS0167-8140(14)00358-2\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.radonc.2014.08.028\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=25216572&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref24\"\u003EGacha SJ, Le&#243;n SA. Segmentation of mandibles in computer tomography volumes of patients with foam cells carcinoma. In: Proceedings of the IX International Seminar of Biomedical Engineering (SIB). 2018 Presented at: IX International Seminar of Biomedical Engineering (SIB); May 16-18, 2018; Bogota, Colombia. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1109\u002FSIB.2018.8467732\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref25\"\u003ERaudaschl PF, Zaffino P, Sharp GC, Spadea MF, Chen A, Dawant BM, et al. Evaluation of segmentation methods on head and neck CT: auto-segmentation challenge 2015. Med Phys 2017 May 21;44(5):2020-2036 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1002\u002Fmp.12197\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1002\u002Fmp.12197\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=28273355&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref26\"\u003EWu X, Udupa JK, Tong Y, Odhner D, Pednekar GV, Simone CB, et al. 
AAR-RT - A system for auto-contouring organs at risk on CT images for radiation therapy planning: Principles, design, and large-scale evaluation on head-and-neck and thoracic cancer cases. Med Image Anal 2019 May;54:45-62 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F30831357\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.media.2019.01.008\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30831357&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref27\"\u003EFritscher K, Raudaschl P, Zaffino P, Spadea MF, Sharp GC, Schubert R. Deep neural networks for fast segmentation of 3D medical images. In: Medical Image Computing and Computer-Assisted Intervention &#8211; MICCAI 2016. Switzerland: Springer; 2016:158-165.\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref28\"\u003EIbragimov B, Xing L. Segmentation of organs-at-risks in head and neck CT images using convolutional neural networks. Med Phys 2017 Feb 16;44(2):547-557 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F28205307\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1002\u002Fmp.12045\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=28205307&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref29\"\u003EMo&#269;nik D, Ibragimov B, Xing L, Strojan P, Likar B, Pernu&#353; F, et al. Segmentation of parotid glands from registered CT and MR images. 
Phys Med 2018 Aug;52:33-41 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F30139607\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.ejmp.2018.06.012\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30139607&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref30\"\u003ERen X, Xiang L, Nie D, Shao Y, Zhang H, Shen D, et al. Interleaved 3D-CNNs for joint segmentation of small-volume structures in head and neck CT images. Med Phys 2018 May 23;45(5):2063-2075 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F29480928\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1002\u002Fmp.12837\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=29480928&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref31\"\u003EZhong T, Huang X, Tang F, Liang S, Deng X, Zhang Y. Boosting-based cascaded convolutional neural networks for the segmentation of CT organs-at-risk in nasopharyngeal carcinoma. 
Med Phys 2019 Sep 16;46(12):5602-5611 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1002\u002Fmp.13825\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1002\u002Fmp.13825\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31529501&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref32\"\u003ERonneberger O, Fischer P, Brox T. U-net: Convolutional networks for biomedical image segmentation. In: Medical Image Computing and Computer-Assisted Intervention &#8211; MICCAI 2015. Switzerland: Springer; 2015:234-241.\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref33\"\u003EDe Fauw J, Ledsam JR, Romera-Paredes B, Nikolov S, Tomasev N, Blackwell S, et al. Clinically applicable deep learning for diagnosis and referral in retinal disease. Nat Med 2018 Sep 13;24(9):1342-1350. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1038\u002Fs41591-018-0107-6\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30104768&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref34\"\u003EH&#228;nsch A, Schwier M, Gass T, Morgas T, Haas B, Klein J, et al. Comparison of different deep learning approaches for parotid gland segmentation from CT images. Proc. 
SPIE 10575, Med Imag 2018: Comp-Aid Diag 2018:1057519 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1117\u002F12.2292962\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1117\u002F12.2292962\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref35\"\u003EZhu W, Huang Y, Tang H, Qian Z, Du N, Fan W, et al. AnatomyNet: Deep 3D Squeeze-and-excitation U-Nets for fast and fully automated whole-volume anatomical segmentation. BioRxiv 2018:A. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1101\u002F392969\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref36\"\u003ETong N, Gou S, Yang S, Ruan D, Sheng K. Fully automatic multi-organ segmentation for head and neck cancer radiotherapy using shape representation model constrained fully convolutional neural networks. Med Phys 2018 Oct 19;45(10):4558-4567 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F30136285\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1002\u002Fmp.13147\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30136285&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref37\"\u003ELiang S, Tang F, Huang X, Yang K, Zhong T, Hu R, et al. Deep-learning-based detection and segmentation of organs at risk in nasopharyngeal carcinoma computed tomographic images for radiotherapy planning. 
Eur Radiol 2019 Apr 9;29(4):1961-1967 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1007\u002Fs00330-018-5748-9\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1007\u002Fs00330-018-5748-9\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30302589&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref38\"\u003EWillems S, Crijns W, Saint-Esteven AL, Veen JV, Robben D, Depuydt T, et al. Clinical implementation of DeepVoxNet for auto-delineation of organs at risk in head and neck cancer patients in radiotherapy. In: Endoscopy, Clinical Image-Based Procedures, and Skin Image Analysis. Switzerland: Springer; 2018:223-232.\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref39\"\u003EKodym O, &#352;pan&#283;l M, Herout A. Segmentation of head and neck organs at risk using CNN with batch dice loss. arXiv.org: Computer Science - Computer Vision and Pattern Recognition. 2018. \n &#160; URL: \u003Ca target=\"_blank\" href=\"https:\u002F\u002Farxiv.org\u002Fabs\u002F1812.02427\"\u003Ehttps:\u002F\u002Farxiv.org\u002Fabs\u002F1812.02427\u003C\u002Fa\u003E [accessed 2021-05-17]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref40\"\u003EWang Y, Zhao L, Wang M, Song Z. Organ at risk segmentation in head and neck CT images using a two-stage segmentation framework based on 3D U-Net. 
IEEE Access 2019;7:144591-144602 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1109\u002FACCESS.2019.2944958\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1109\u002Faccess.2019.2944958\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref41\"\u003EMen K, Geng H, Cheng C, Zhong H, Huang M, Fan Y, et al. Technical Note: More accurate and efficient segmentation of organs-at-risk in radiotherapy with convolutional neural networks cascades. Med Phys 2019 Jan 07;46(1):286-292 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F30450825\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1002\u002Fmp.13296\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30450825&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref42\"\u003ETappeiner E, Pr&#246;ll S, H&#246;nig M, Raudaschl PF, Zaffino P, Spadea MF, et al. Multi-organ segmentation of the head and neck area: an efficient hierarchical neural networks approach. 
Int J Comput Assist Radiol Surg 2019 May 7;14(5):745-754 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1007\u002Fs11548-019-01922-4\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1007\u002Fs11548-019-01922-4\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30847761&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref43\"\u003ERhee DJ, Cardenas CE, Elhalawani H, McCarroll R, Zhang L, Yang J, et al. Automatic detection of contouring errors using convolutional neural networks. Med Phys 2019 Nov 26;46(11):5086-5097 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F31505046\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1002\u002Fmp.13814\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31505046&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref44\"\u003ETang H, Chen X, Liu Y, Lu Z, You J, Yang M, et al. Clinically applicable deep learning framework for organs at risk delineation in CT images. Nat Mach Intell 2019 Sep 30;1(10):480-491 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1038\u002Fs42256-019-0099-z\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1038\u002Fs42256-019-0099-z\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref45\"\u003Evan Rooij W, Dahele M, Brandao HR, Delaney AR, Slotman BJ, Verbakel WF. 
Deep learning-based delineation of head and neck organs at risk: geometric and dosimetric evaluation. Int J Radiat Oncol Biol Phys 2019 Jul 01;104(3):677-684 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1016\u002Fj.ijrobp.2019.02.040\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.ijrobp.2019.02.040\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30836167&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref46\"\u003EGou S, Tong N, Qi S, Yang S, Chin R, Sheng K. Self-channel-and-spatial-attention neural network for automated multi-organ segmentation on head and neck CT images. Phys Med Biol 2020 Dec 11;65(24):245034 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1088\u002F1361-6560\u002Fab79c3\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1088\u002F1361-6560\u002Fab79c3\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=32097892&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref47\"\u003EMak RH, Endres MG, Paik JH, Sergeev RA, Aerts H, Williams CL, et al. Use of crowd innovation to develop an artificial intelligence-based solution for radiation therapy targeting. 
JAMA Oncol 2019 May 01;5(5):654-661 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F30998808\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1001\u002Fjamaoncol.2019.0159\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30998808&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref48\"\u003EHead-neck cetuximab. The Cancer Imaging Archive (TCIA). \n &#160; URL: \u003Ca target=\"_blank\" href=\"https:\u002F\u002Fwiki.cancerimagingarchive.net\u002Fdisplay\u002FPublic\u002FHead-Neck+Cetuximab\"\u003Ehttps:\u002F\u002Fwiki.cancerimagingarchive.net\u002Fdisplay\u002FPublic\u002FHead-Neck+Cetuximab\u003C\u002Fa\u003E [accessed 2021-05-17]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref49\"\u003EClark K, Vendt B, Smith K, Freymann J, Kirby J, Koppel P, et al. The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository. J Digit Imaging 2013 Dec 25;26(6):1045-1057 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F23884657\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1007\u002Fs10278-013-9622-7\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=23884657&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref50\"\u003EZuley ML, Jarosz R, Kirk S, Colen R, Garcia K, Aredes ND. Radiology data from the cancer genome atlas head-neck squamous cell carcinoma [TCGA-HNSC] collection. 
The Cancer Imaging Archive (TCIA) 2020:A [\u003Ca href=\"https:\u002F\u002Fwiki.cancerimagingarchive.net\u002Fdisplay\u002FPublic\u002FTCGA-HNSC\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.7937\u002FK9\u002FTCIA.2016.LXKQ47MS\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref51\"\u003EBrouwer CL, Steenbakkers RJ, Bourhis J, Budach W, Grau C, Gr&#233;goire V, et al. CT-based delineation of organs at risk in the head and neck region: DAHANCA, EORTC, GORTEC, HKNPCSG, NCIC CTG, NCRI, NRG Oncology and TROG consensus guidelines. Radiother Oncol 2015 Oct;117(1):83-90. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.radonc.2015.07.041\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref52\"\u003EFelzenszwalb PF, Huttenlocher DP. Distance transforms of sampled functions. Theory Comput 2012;8(1):415-428. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.4086\u002Ftoc.2012.v008a019\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref53\"\u003EKingma DP, Ba J. Adam: a method for stochastic optimization. arXiv.org : Computer Science - Machine Learning. 2014. \n &#160; URL: \u003Ca target=\"_blank\" href=\"http:\u002F\u002Farxiv.org\u002Fabs\u002F1412.6980\"\u003Ehttp:\u002F\u002Farxiv.org\u002Fabs\u002F1412.6980\u003C\u002Fa\u003E [accessed 2021-05-17]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref54\"\u003EWu Z, Shen C, van den Hengel A. Bridging category-level and instance-level semantic image segmentation. arXiv.org : Computer Science - Computer Vision and Pattern Recognition. 2016. 
\n &#160; URL: \u003Ca target=\"_blank\" href=\"http:\u002F\u002Farxiv.org\u002Fabs\u002F1605.06885v1\"\u003Ehttp:\u002F\u002Farxiv.org\u002Fabs\u002F1605.06885v1\u003C\u002Fa\u003E [accessed 2021-05-17]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref55\"\u003ELorensen WE, Cline HE. Marching cubes: a high resolution 3D surface construction algorithm. SIGGRAPH Comput Graph 1987 Aug;21(4):163-169. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1145\u002F37402.37422\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref56\"\u003EWang Z, Wei L, Wang L, Gao Y, Chen W, Shen D. Hierarchical vertex regression-based segmentation of head and neck CT images for radiotherapy planning. IEEE Trans Image Process 2018 Feb;27(2):923-937. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1109\u002Ftip.2017.2768621\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref57\"\u003ETorosdagli N, Liberton DK, Verma P, Sincan M, Lee J, Pattanaik S, et al. Robust and fully automated segmentation of mandible from CT scans. In: Proceedings of the IEEE 14th International Symposium on Biomedical Imaging (ISBI 2017). 2017 Presented at: IEEE 14th International Symposium on Biomedical Imaging (ISBI 2017); April 18-21, 2017; Melbourne, VIC, Australia. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1109\u002Fisbi.2017.7950734\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref58\"\u003ELiang S, Thung K, Nie D, Zhang Y, Shen D. Multi-view spatial aggregation framework for joint localization and segmentation of organs at risk in head and neck CT images. 
IEEE Trans Med Imaging 2020 Sep;39(9):2794-2805 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1109\u002FTMI.2020.2975853\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1109\u002Ftmi.2020.2975853\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref59\"\u003EQiu B, Guo J, Kraeima J, Glas HH, Borra RJ, Witjes MJ, et al. Recurrent convolutional neural networks for mandible segmentation from computed tomography. arXiv.org: Electrical Engineering and Systems Science - Image and Video Processing. 2020. \n &#160; URL: \u003Ca target=\"_blank\" href=\"https:\u002F\u002Farxiv.org\u002Fabs\u002F2003.06486\"\u003Ehttps:\u002F\u002Farxiv.org\u002Fabs\u002F2003.06486\u003C\u002Fa\u003E [accessed 2021-05-27]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref60\"\u003ESun S, Liu Y, Bai N, Tang H, Chen X, Huang Q, et al. Attentionanatomy: a unified framework for whole-body organs at risk segmentation using multiple partially annotated datasets. In: Proceedings of the IEEE 17th International Symposium on Biomedical Imaging (ISBI). 2020 Presented at: IEEE 17th International Symposium on Biomedical Imaging (ISBI); April 3-7, 2020; Iowa City, IA, USA p. A. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1109\u002Fisbi45749.2020.9098588\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref61\"\u003Evan Dijk LV, Van den Bosch L, Aljabar P, Peressutti D, Both S, Steenbakkers RJ, et al. Improving automatic delineation for head and neck organs at risk by Deep Learning Contouring. 
Radiother Oncol 2020 Jan;142:115-123 [\u003Ca href=\"https:\u002F\u002Flinkinghub.elsevier.com\u002Fretrieve\u002Fpii\u002FS0167-8140(19)33111-1\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.radonc.2019.09.022\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31653573&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref62\"\u003EWong J, Fong A, McVicar N, Smith S, Giambattista J, Wells D, et al. Comparing deep learning-based auto-segmentation of organs at risk and clinical target volumes to expert inter-observer variability in radiotherapy planning. Radiother Oncol 2020 Mar;144:152-158 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1016\u002Fj.radonc.2019.10.019\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.radonc.2019.10.019\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31812930&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref63\"\u003EChan JW, Kearney V, Haaf S, Wu S, Bogdanov M, Reddick M, et al. A convolutional neural network algorithm for automatic segmentation of head and neck organs at risk using deep lifelong learning. 
Med Phys 2019 May 04;46(5):2204-2213 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1002\u002Fmp.13495\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1002\u002Fmp.13495\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30887523&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref64\"\u003EGao Y, Huang R, Chen M, Wang Z, Deng J, Chen Y, et al. FocusNet: imbalanced large and small organ segmentation with an end-to-end deep neural network for head and neck CT images. In: Medical Image Computing and Computer Assisted Intervention &#8211; MICCAI 2019. Switzerland: Springer; 2019:829-838.\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref65\"\u003EJiang J, Sharif E, Um H, Berry S, Veeraraghavan H. Local block-wise self attention for normal organ segmentation. arXiv.org: Computer Science - Computer Vision and Pattern Recognition. 2019. \n &#160; URL: \u003Ca target=\"_blank\" href=\"https:\u002F\u002Farxiv.org\u002Fabs\u002F1909.05054\"\u003Ehttps:\u002F\u002Farxiv.org\u002Fabs\u002F1909.05054\u003C\u002Fa\u003E [accessed 2021-05-16]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref66\"\u003ELei W, Wang H, Gu R, Zhang S, Wang G. DeepIGeoS-V2: Deep interactive segmentation of multiple organs from head and neck images with lightweight CNNs. In: Large-Scale Annotation of Biomedical Data and Expert Label Synthesis and Hardware Aware Learning for Medical Imaging and Computer Assisted Intervention. Switzerland: Springer; 2019:61-69.\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref67\"\u003ESun Y, Shi H, Zhang S, Wang P, Zhao W, Zhou X, et al. 
Accurate and rapid CT image segmentation of the eyes and surrounding organs for precise radiotherapy. Med Phys 2019 May 22;46(5):2214-2222 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1002\u002Fmp.13463\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1002\u002Fmp.13463\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=30815885&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref68\"\u003ETong N, Gou S, Yang S, Cao M, Sheng K. Shape constrained fully convolutional DenseNet with adversarial training for multiorgan segmentation on head and neck CT and low-field MR images. Med Phys 2019 Jun 06;46(6):2669-2682 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F31002188\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1002\u002Fmp.13553\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31002188&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref69\"\u003EXue Y, Tang H, Qiao Z, Gong G, Yin Y, Qian Z, et al. Shape-aware organ segmentation by predicting signed distance maps. 
Proc AAAI Conf Artif Intell 2020 Apr 03;34(07):12565-12572 [\u003Ca href=\"https:\u002F\u002Farxiv.org\u002Fabs\u002F1912.03849\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1609\u002Faaai.v34i07.6946\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref70\"\u003ENikolov S, Blackwell S, Mendes R, Fauw JD, Meyer C, Hughes C, DeepMind Radiographer Consortium, et al. Deep learning to achieve clinically applicable segmentation of head and neck anatomy for radiotherapy. arXiv.org: Computer Science - Computer Vision and Pattern Recognition. 2018. \n &#160; URL: \u003Ca target=\"_blank\" href=\"https:\u002F\u002Farxiv.org\u002Fabs\u002F1809.04430v1\"\u003Ehttps:\u002F\u002Farxiv.org\u002Fabs\u002F1809.04430v1\u003C\u002Fa\u003E [accessed 2021-05-16]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref71\"\u003EFritscher KD, Peroni M, Zaffino P, Spadea MF, Schubert R, Sharp G. Automatic segmentation of head and neck CT images for radiotherapy treatment planning using multiple atlases, statistical appearance models, and geodesic active contours. Med Phys 2014 May 24;41(5):051910 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F24784389\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1118\u002F1.4871623\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=24784389&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref72\"\u003EQazi AA, Pekar V, Kim J, Xie J, Breen SL, Jaffray DA. 
Auto-segmentation of normal and target structures in head and neck CT images: a feature-driven model-based approach. Med Phys 2011 Nov 26;38(11):6160-6170 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1118\u002F1.3654160\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1118\u002F1.3654160\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=22047381&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref73\"\u003ETam CM, Yang X, Tian S, Jiang X, Beitler JJ, Li S. Automated delineation of organs-at-risk in head and neck CT images using multi-output support vector regression. In: Proceedings of the SPIE 10578, Medical Imaging 2018: Biomedical Applications in Molecular, Structural, and Functional Imaging. 2018 Presented at: SPIE 10578, Medical Imaging 2018: Biomedical Applications in Molecular, Structural, and Functional Imaging; March 12, 2018; Houston, Texas, United States. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1117\u002F12.2292556\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref74\"\u003EWang Z, Liu X, Chen W. Head and neck CT atlases alignment based on anatomical priors constraint. J Med Imaging Health Infor 2019 Dec 01;9(9):2004-2011 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1166\u002Fjmihi.2019.2844\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1166\u002Fjmihi.2019.2844\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref75\"\u003EAyyalusamy A, Vellaiyan S, Subramanian S, Ilamurugu A, Satpathy S, Nauman M, et al. 
Auto-segmentation of head and neck organs at risk in radiotherapy and its dependence on anatomic similarity. Radiat Oncol J 2019 Jun;37(2):134-142 [\u003Ca href=\"https:\u002F\u002Fdx.doi.org\u002F10.3857\u002Froj.2019.00038\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.3857\u002Froj.2019.00038\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31266293&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref76\"\u003EHaq R, Berry SL, Deasy JO, Hunt M, Veeraraghavan H. Dynamic multiatlas selection-based consensus segmentation of head and neck structures from CT images. Med Phys 2019 Dec 31;46(12):5612-5622 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F31587300\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1002\u002Fmp.13854\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31587300&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref77\"\u003EMcCarroll RE, Beadle BM, Balter PA, Burger H, Cardenas CE, Dalvie S, et al. Retrospective validation and clinical implementation of automated contouring of organs at risk in the head and neck: a step toward automated radiation treatment planning for low- and middle-income countries. 
J Glob Oncol 2018 Dec(4):1-11 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1200\u002FJGO.18.00055\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1200\u002Fjgo.18.00055\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref78\"\u003ELiu Q, Qin A, Liang J, Yan D. Evaluation of atlas-based auto-segmentation and deformable propagation of organs-at-risk for head-and-neck adaptive radiotherapy. Recent Pat Top Imaging 2016 May 24;5(2):79-87 [\u003Ca href=\"https:\u002F\u002Fwww.researchgate.net\u002Fprofile\u002FAn_Qin2\u002Fpublication\u002F304143072_Evaluation_of_Atlas-Based_Auto-Segmentation_and_Deformable_Propagation_of_Organs-at-Risk_for_Head-and-Neck_Adaptive_Radiotherapy\u002Flinks\u002F5bd8b8fda6fdcc3a8db1722c\u002FEvaluation-of-Atlas-Based-Auto-Segmentation-and-Deformable-Propagation-of-Organs-at-Risk-for-Head-and-Neck-Adaptive-Radiotherapy.pdf\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.2174\u002F2451827105999160415123925\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref79\"\u003ETao C, Yi J, Chen N, Ren W, Cheng J, Tung S, et al. Multi-subject atlas-based auto-segmentation reduces interobserver variation and improves dosimetric parameter consistency for organs at risk in nasopharyngeal carcinoma: a multi-institution clinical study. 
Radiother Oncol 2015 Jun;115(3):407-411 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1016\u002Fj.radonc.2015.05.012\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.radonc.2015.05.012\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=26025546&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref80\"\u003EWachinger C, Fritscher K, Sharp G, Golland P. Contour-driven atlas-based segmentation. IEEE Trans Med Imaging 2015 Dec;34(12):2492-2505 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1109\u002FTMI.2015.2442753\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1109\u002Ftmi.2015.2442753\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref81\"\u003EZhu M, Bzdusek K, Brink C, Eriksen JG, Hansen O, Jensen HA, et al. Multi-institutional quantitative evaluation and clinical validation of Smart Probabilistic Image Contouring Engine (SPICE) autosegmentation of target structures and normal tissues on computer tomography images in the head and neck, thorax, liver, and male pelvis areas. 
Int J Radiat Oncol Biol Phys 2013 Nov 15;87(4):809-816 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1016\u002Fj.ijrobp.2013.08.007\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.ijrobp.2013.08.007\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=24138920&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref82\"\u003ETeguh DN, Levendag PC, Voet PW, Al-Mamgani A, Han X, Wolf TK, et al. Clinical validation of atlas-based auto-segmentation of multiple target volumes and normal tissue (swallowing\u002Fmastication) structures in the head and neck. Int J Radiat Oncol Biol Phys 2011 Nov 15;81(4):950-957. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.ijrobp.2010.07.009\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=20932664&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref83\"\u003EHan X, Hibbard LS, O'Connell NP, Willcut V. Automatic segmentation of parotids in head and neck CT images using multi-atlas fusion. ResearchGate. 2011. 
\n &#160; URL: \u003Ca target=\"_blank\" href=\"https:\u002F\u002Fwww.researchgate.net\u002Fprofile\u002FLyndon-Hibbard\u002Fpublication\u002F228519091_Automatic_Segmentation_of_Parotids_in_Head_and_Neck_CT_Images_using_Multi-atlas_Fusion\u002Flinks\u002F0deec516d54dfccb97000000\u002FAutomatic-Segmentation-of-Parotids-in-Head-and-Neck-CT-Images-using-Multi-atlas-Fusion.pdf\"\u003Ehttps:\u002F&#8203;\u002Fwww.&#8203;researchgate.net\u002F&#8203;profile\u002F&#8203;Lyndon-Hibbard\u002F&#8203;publication\u002F&#8203;228519091_Automatic_Segmentation_of_Parotids_in_Head_and_Neck_CT_Images_using_Multi-atlas_Fusion\u002F&#8203;links\u002F&#8203;0deec516d54dfccb97000000\u002F&#8203;Automatic-Segmentation-of-Parotids-in-Head-and-Neck-CT-Images-using-Multi-atlas-Fusion.&#8203;pdf\u003C\u002Fa\u003E [accessed 2021-05-27]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref84\"\u003ESims R, Isambert A, Gr&#233;goire V, Bidault F, Fresco L, Sage J, et al. A pre-clinical assessment of an atlas-based automatic segmentation tool for the head and neck. Radiother Oncol 2009 Dec;93(3):474-478. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.radonc.2009.08.013\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=19758720&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref85\"\u003EHan X, Hoogeman MS, Levendag PC, Hibbard LS, Teguh DN, Voet P, et al. Atlas-based auto-segmentation of head and neck CT images. In: Medical Image Computing and Computer-Assisted Intervention &#8211; MICCAI 2008. Berlin, Heidelberg: Springer; 2008:434-441.\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref86\"\u003EHoogeman M, Han X, Teguh D, Voet P, Nowak P, Wolf T, et al. 
Atlas-based auto-segmentation of CT images in head and neck cancer: what is the best approach? Int J Radiat Oncol Biol Phys 2008 Sep;72(1):591 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1016\u002Fj.ijrobp.2008.06.196\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.ijrobp.2008.06.196\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref87\"\u003EHuang C, Badiei M, Seo H, Ma M, Liang X, Capaldi D, et al. Atlas based segmentations via semi-supervised diffeomorphic registrations. arXiv.org: Computer Science - Computer Vision and Pattern Recognition. 2019. \n &#160; URL: \u003Ca target=\"_blank\" href=\"https:\u002F\u002Farxiv.org\u002Fabs\u002F1911.10417\"\u003Ehttps:\u002F\u002Farxiv.org\u002Fabs\u002F1911.10417\u003C\u002Fa\u003E [accessed 2021-05-16]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref88\"\u003EHardcastle N, Tom&#233; WA, Cannon DM, Brouwer CL, Wittendorp PW, Dogan N, et al. A multi-institution evaluation of deformable image registration algorithms for automatic organ delineation in adaptive head and neck radiotherapy. Radiat Oncol 2012 Jun 15;7(1):90 [\u003Ca href=\"https:\u002F\u002Flink.springer.com\u002Farticle\u002F10.1186\u002F1748-717X-7-90\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1186\u002F1748-717x-7-90\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref89\"\u003ELa Macchia M, Fellin F, Amichetti M, Cianchetti M, Gianolini S, Paola V, et al. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer. 
Radiat Oncol 2012;7(1):160 [\u003Ca href=\"https:\u002F\u002Flink.springer.com\u002Farticle\u002F10.1186\u002F1748-717X-7-160\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1186\u002F1748-717x-7-160\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref90\"\u003EZhang T, Chi Y, Meldolesi E, Yan D. Automatic delineation of on-line head-and-neck computed tomography images: toward on-line adaptive radiotherapy. Int J Radiat Oncol Biol Phys 2007 Jun 01;68(2):522-530 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1016\u002Fj.ijrobp.2007.01.038\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.ijrobp.2007.01.038\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=17418960&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref91\"\u003EDice LR. Measures of the amount of ecologic association between species. Ecology 1945;26(3):297-302. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.2307\u002F1932409\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref92\"\u003EHong T, Tome W, Chappel R, Harari P. Variations in target delineation for head and neck IMRT: an international multi-institutional study. 
Int J Radiat Oncol Biol Phys 2004 Sep;60:157-158 [\u003Ca href=\"http:\u002F\u002Fwww.sciencedirect.com\u002Fscience\u002Farticle\u002Fpii\u002FS0360301604011307\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fs0360-3016(04)01130-7\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref93\"\u003EWuthrick EJ, Zhang Q, Machtay M, Rosenthal DI, Nguyen-Tan PF, Fortin A, et al. Institutional clinical trial accrual volume and survival of patients with head and neck cancer. J Clin Oncol 2015 Jan 10;33(2):156-164 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1200\u002FJCO.2014.56.5218\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1200\u002Fjco.2014.56.5218\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref94\"\u003EVaassen F, Hazelaar C, Vaniqui A, Gooding M, van der Heyden B, Canters R, et al. Evaluation of measures for assessing time-saving of automatic organ-at-risk segmentation in radiotherapy. Phys Imaging Radiat Oncol 2020 Jan;13:1-6 [\u003Ca href=\"https:\u002F\u002Flinkinghub.elsevier.com\u002Fretrieve\u002Fpii\u002FS2405-6316(19)30063-6\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.phro.2019.12.001\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=33458300&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref95\"\u003EKiser KJ, Barman A, Stieb S, Fuller CD, Giancardo L. 
Novel autosegmentation spatial similarity metrics capture the time required to correct segmentations better than traditional metrics in a thoracic cavity segmentation workflow. medRxiv 2020. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1101\u002F2020.05.14.20102103\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref96\"\u003ESharp G, Fritscher KD, Pekar V, Peroni M, Shusharina N, Veeraraghavan H, et al. Vision 20\u002F20: perspectives on automated image segmentation for radiotherapy. Med Phys 2014 May 24;41(5):050902 [\u003Ca href=\"http:\u002F\u002Feuropepmc.org\u002Fabstract\u002FMED\u002F24784366\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1118\u002F1.4871620\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=24784366&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref97\"\u003EKosmin M, Ledsam J, Romera-Paredes B, Mendes R, Moinuddin S, de Souza D, et al. Rapid advances in auto-segmentation of organs at risk and target volumes in head and neck cancer. 
Radiother Oncol 2019 Jun;135:130-140 [\u003Ca href=\"https:\u002F\u002Fdoi.org\u002F10.1016\u002Fj.radonc.2019.03.004\" target=\"_blank\"\u003EFREE Full text\u003C\u002Fa\u003E] [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1016\u002Fj.radonc.2019.03.004\"\u003ECrossRef\u003C\u002Fa\u003E] [\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fentrez\u002Fquery.fcgi?cmd=Retrieve&amp;db=PubMed&amp;list_uids=31015159&amp;dopt=Abstract\" target=\"_blank\"\u003EMedline\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref98\"\u003EGuo D, Jin D, Zhu Z, Ho TY, Harrison AP, Chao CH, et al. Organ at risk segmentation for head and neck cancer using stratified learning and neural architecture search. In: Proceedings of the IEEE\u002FCVF Conference on Computer Vision and Pattern Recognition (CVPR). 2020 Presented at: IEEE\u002FCVF Conference on Computer Vision and Pattern Recognition (CVPR); June 13-19, 2020; Seattle, WA, USA. [\u003Ca target=\"_blank\" href=\"https:\u002F\u002Fdx.doi.org\u002F10.1109\u002Fcvpr42600.2020.00428\"\u003ECrossRef\u003C\u002Fa\u003E]\u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref99\"\u003ESurface distance. DeepMind. \n &#160; URL: \u003Ca target=\"_blank\" href=\"https:\u002F\u002Fgithub.com\u002Fdeepmind\u002Fsurface-distance\"\u003Ehttps:\u002F\u002Fgithub.com\u002Fdeepmind\u002Fsurface-distance\u003C\u002Fa\u003E [accessed 2021-05-27]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003Cli\u003E\u003Cspan id=\"ref100\"\u003ETCIA CT scan dataset. DeepMind. 
\n &#160; URL: \u003Ca target=\"_blank\" href=\"https:\u002F\u002Fgithub.com\u002Fdeepmind\u002Ftcia-ct-scan-dataset\"\u003Ehttps:\u002F\u002Fgithub.com\u002Fdeepmind\u002Ftcia-ct-scan-dataset\u003C\u002Fa\u003E [accessed 2021-05-27]\n \u003C\u002Fspan\u003E\u003C\u002Fli\u003E\u003C\u002Fol\u003E\u003C\u002Fdiv\u003E\u003Cbr\u002F\u003E\u003Chr\u002F\u003E\u003Ca name=\"Abbreviations\"\u003E&#8206;\u003C\u002Fa\u003E\u003Ch4 class=\"navigation-heading\" id=\"Abbreviations\" data-label=\"Abbreviations\"\u003EAbbreviations\u003C\u002Fh4\u003E\u003Ctable width=\"80%\" border=\"0\" align=\"center\"\u003E\u003Ctr\u003E\u003Ctd\u003E\u003Cb\u003ECT:\u003C\u002Fb\u003E computed tomography\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr\u003E\u003Ctd\u003E\u003Cb\u003EDSC:\u003C\u002Fb\u003E Dice similarity coefficient\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr\u003E\u003Ctd\u003E\u003Cb\u003ENHS:\u003C\u002Fb\u003E National Health Service\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr\u003E\u003Ctd\u003E\u003Cb\u003EPDDCA:\u003C\u002Fb\u003E public domain database for computational anatomy\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr\u003E\u003Ctd\u003E\u003Cb\u003ETCGA-HNSC:\u003C\u002Fb\u003E The Cancer Genome Atlas Head-Neck Squamous Cell Carcinoma\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr\u003E\u003Ctd\u003E\u003Cb\u003ETCIA:\u003C\u002Fb\u003E The Cancer Imaging Archive\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003Ctr\u003E\u003Ctd\u003E\u003Cb\u003EUCLH:\u003C\u002Fb\u003E University College London Hospitals\u003C\u002Ftd\u003E\u003C\u002Ftr\u003E\u003C\u002Ftable\u003E\u003Cbr\u002F\u003E\u003Chr\u002F\u003E\u003Cp style=\"font-style: italic\"\u003EEdited by R Kukafka; submitted 30.11.20; peer-reviewed by JA Ben&#237;tez-Andrades, R Vilela; comments to author 11.01.21; revised version received 10.02.21; accepted 30.04.21; published 12.07.21\u003C\u002Fp\u003E\u003Ca 
href=\"https:\u002F\u002Fsupport.jmir.org\u002Fhc\u002Fen-us\u002Farticles\u002F115002955531\" id=\"Copyright\" target=\"_blank\" class=\"navigation-heading h4 d-block\" aria-label=\"Copyright - what is a Creative Commons License?\" data-label=\"Copyright\"\u003ECopyright \u003Cspan class=\"fas fa-question-circle\"\u002F\u003E\u003C\u002Fa\u003E\u003Cp class=\"article-copyright\"\u003E&#169;Stanislav Nikolov, Sam Blackwell, Alexei Zverovitch, Ruheena Mendes, Michelle Livne, Jeffrey De Fauw, Yojan Patel, Clemens Meyer, Harry Askham, Bernadino Romera-Paredes, Christopher Kelly, Alan Karthikesalingam, Carlton Chu, Dawn Carnell, Cheng Boon, Derek D'Souza, Syed Ali Moinuddin, Bethany Garie, Yasmin McQuinlan, Sarah Ireland, Kiarna Hampton, Krystle Fuller, Hugh Montgomery, Geraint Rees, Mustafa Suleyman, Trevor Back, C&#237;an Owen Hughes, Joseph R Ledsam, Olaf Ronneberger. Originally published in the Journal of Medical Internet Research (https:\u002F\u002Fwww.jmir.org), 12.07.2021.\u003C\u002Fp\u003E\u003Csmall class=\"article-license\"\u003E\u003Cp class=\"abstract-paragraph\"\u003EThis is an open-access article distributed under the terms of the Creative Commons Attribution License (https:\u002F\u002Fcreativecommons.org\u002Flicenses\u002Fby\u002F4.0\u002F), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. 
The complete bibliographic information, a link to the original publication on https:\u002F\u002Fwww.jmir.org\u002F, as well as this copyright and license information must be included.\u003C\u002Fp\u003E\u003C\u002Fsmall\u003E\u003Cbr\u002F\u003E\u003C\u002Fsection\u003E\u003C\u002Farticle\u003E\u003C\u002Fsection\u003E\u003C\u002Fsection\u003E\u003C\u002Fmain\u003E"}],fetch:{},error:a,state:{host:a,environment:d,journalPath:q,keys:{},domains:{},screensize:"desktop",accessibility:{filter:"none","font-weight":"inherit","font-size":.625,"text-align":"initial"},announcements:{data:[{announcement_id:525,title:"JMIR Publications Integrates With Web of Science to Recognize Peer Reviewers",description_short:"\u003Cp\u003EJMIR Publications, a leading open access publisher of academic journals, is pleased to announce a new partnership with the Web of Science Reviewer Recognition Service (Web of Science RRS) to provide official recognition for the critical contributions of its peer reviewers.\u003C\u002Fp\u003E\u003Cp\u003E\u003Cbr\u003E\u003C\u002Fp\u003E",date_posted:"2024-11-14T16:33:25.000Z",journal_id:b},{announcement_id:517,title:"Webinar Announcement: Navigating Academic Promotion - Key Strategies for Junior Faculty Success",description_short:"\u003Cp\u003EJoin us for an insightful webinar inspired by the article “Advice for Junior Faculty Regarding Academic Promotion.” This session is designed to provide junior faculty with practical strategies to navigate the academic promotion process smoothly and effectively. 
Whether you're just beginning your academic journey or aiming for your next promotion, this webinar offers valuable advice from both senior and junior faculty members.\u003C\u002Fp\u003E\u003Cp\u003E\u003Cbr\u003E\u003C\u002Fp\u003E",date_posted:"2024-10-22T15:09:34.000Z",journal_id:b},{announcement_id:512,title:"JMIR Publications CEO and Executive Editor Gunther Eysenbach Achieves #1 Ranking as Most Cited Researcher in Medical Informatics for Fifth Consecutive Year",description_short:"\u003Cp\u003EJMIR Publications is proud to announce that Gunther Eysenbach, founder, CEO and executive editor, has once again been named the #1 most cited researcher in the subfield of medical informatics by Stanford\u002FElsevier’s Top 2% Scientists rankings\u003C\u002Fp\u003E",date_posted:"2024-10-11T17:08:49.000Z",journal_id:b},{announcement_id:509,title:"Call for Papers: Theme Issue: The Emergence of Medical Futures Studies",description_short:"\u003Cp\u003E\u003Cstrong\u003EThe \u003Cem\u003EJournal of Medical Internet Research\u003C\u002Fem\u003E is pleased to announce a call for papers for the theme issue The Emergence of Medical Futures Studies. 
This is the first ever call for papers on this topic.\u003C\u002Fstrong\u003E\u003C\u002Fp\u003E",date_posted:"2024-10-08T12:00:52.000Z",journal_id:b},{announcement_id:493,title:"JMIR Publications + PREreview Live Review: Thursday, July 18- 12 pm ET",description_short:"\u003Cp\u003EJMIR Publications and PREreview are pleased to announce our next Preprint Live Review on Thursday, July 18\u003C\u002Fp\u003E",date_posted:"2024-07-04T13:05:37.000Z",journal_id:b},{announcement_id:476,title:"Journal of Medical Internet Research Receives a Journal Impact Factor of 5.8",description_short:"\u003Cp\u003EJMIR Publications announced today that its flagship journal, the \u003Cem\u003EJournal of Medical Internet Research\u003C\u002Fem\u003E, reported a Journal Impact Factor (JIF) of 5.8 as published in the 2024 Journal Citation Report (JCR) from Clarivate.\u003C\u002Fp\u003E",date_posted:"2024-06-26T16:13:36.000Z",journal_id:b},{announcement_id:475,title:"JMIR Publications Journals Shine in the 2024 Release of Journal Impact Factor by Clarivate",description_short:"\u003Cp\u003EJMIR Publications is pleased to announce the outstanding performance of its scholarly journals in the 2024 release of Journal Citation Reports (JCR) by Clarivate\u003C\u002Fp\u003E",date_posted:"2024-06-24T17:11:56.000Z",journal_id:b},{announcement_id:471,title:"New Scopus CiteScore Rankings Affirm JMIR Publications Journals are Leading in Their Respective Disciplines",description_short:"\u003Cp\u003E\u003Cstrong\u003EExciting News! JMIR Publications Achieves Impressive Results in the Latest Scopus CiteScore Release\u003C\u002Fstrong\u003E\u003C\u002Fp\u003E\u003Cp\u003EJMIR Publications is thrilled to announce an outstanding performance in the recently released Scopus CiteScore rankings. 
In all, 23 of its journals received a CiteScore this year, a testament to the high-quality research published across our diverse portfolio\u003C\u002Fp\u003E",date_posted:"2024-06-18T12:34:19.000Z",journal_id:b},{announcement_id:466,title:"JMIR Publications and Swedish Consortium Bibsam Join Forces in a Landmark Agreement to Advance Open Access",description_short:"\u003Cp\u003EOpen access publisher JMIR Publications, in partnership with Sweden's academic consortium Bibsam, with sales support by Accucoms, recently announced a landmark national agreement to eliminate the burden of article processing charges (APCs) for researchers in Sweden.&nbsp;\u003C\u002Fp\u003E\u003Cp\u003E\u003Cbr\u003E\u003C\u002Fp\u003E",date_posted:"2024-06-06T17:47:30.000Z",journal_id:b},{announcement_id:464,title:"JMIR Publications + PREreview Live Review: June 20, 2024 - 11 am ET",description_short:"\u003Cp\u003EJMIR Publications and PREreview are pleased to announce our next Preprint Live Review on Thursday, June 20\u003C\u002Fp\u003E",date_posted:"2024-05-31T08:51:19.000Z",journal_id:b}],pagination:{from:b,to:u,total:151,perPage:u,firstPage:b,lastPage:v}},article:{data:{article_id:26151,published_at:"2021-07-12T08:45:51.000Z",submitted_at:af,section_id:ag,journal_id:b,year:ah,issue:"7",volume:w,identifier:"26151",url:ai,pdf_url:"https:\u002F\u002Fwww.jmir.org\u002F2021\u002F7\u002Fe26151\u002FPDF",html_url:"https:\u002F\u002Fwww.jmir.org\u002F2021\u002F7\u002Fe26151",xml_url:"https:\u002F\u002Fwww.jmir.org\u002F2021\u002F7\u002Fe26151\u002FXML",title:"Clinically Applicable Segmentation of Head and Neck Anatomy for Radiotherapy: Deep Learning Algorithm Development and Validation Study",public_id:"J Med Internet Res 
2021;23(7):e26151",thumbnail:"https:\u002F\u002Fasset.jmir.pub\u002Fassets\u002F391d0ca13f4f8f74602e2dbe63d25e08.png",doi:"10.2196\u002F26151",pmid:34255661,pmcid:"8314151",issue_title:"July",transfer:a,pages:[],authors:[{first_name:"Stanislav",last_name:"Nikolov",degrees:aj,deceased:a,orcid:"0000-0001-8234-0751",equal_contrib:b,matchedAffiliations:[b]},{first_name:"Sam",last_name:"Blackwell",degrees:aj,deceased:a,orcid:"0000-0001-8730-3036",equal_contrib:b,matchedAffiliations:[b]},{first_name:"Alexei",last_name:"Zverovitch",degrees:g,deceased:a,orcid:"0000-0002-0567-5440",equal_contrib:b,matchedAffiliations:[e]},{first_name:"Ruheena",last_name:"Mendes",degrees:I,deceased:a,orcid:"0000-0003-4754-1181",equal_contrib:f,matchedAffiliations:[h]},{first_name:"Michelle",last_name:"Livne",degrees:g,deceased:a,orcid:"0000-0002-8277-4733",equal_contrib:f,matchedAffiliations:[e]},{first_name:"Jeffrey",last_name:"De Fauw",degrees:"BSc",deceased:a,orcid:"0000-0001-6971-5678",equal_contrib:f,matchedAffiliations:[b]},{first_name:"Yojan",last_name:"Patel",degrees:"BA",deceased:a,orcid:"0000-0001-6397-6279",equal_contrib:f,matchedAffiliations:[e]},{first_name:"Clemens",last_name:"Meyer",degrees:x,deceased:a,orcid:"0000-0003-1165-6104",equal_contrib:f,matchedAffiliations:[b]},{first_name:"Harry",last_name:"Askham",degrees:x,deceased:a,orcid:"0000-0003-1530-4683",equal_contrib:f,matchedAffiliations:[b]},{first_name:"Bernadino",last_name:"Romera-Paredes",degrees:g,deceased:a,orcid:"0000-0003-3604-3590",equal_contrib:f,matchedAffiliations:[b]},{first_name:"Christopher",last_name:"Kelly",degrees:g,deceased:a,orcid:"0000-0002-1246-844X",equal_contrib:f,matchedAffiliations:[e]},{first_name:"Alan",last_name:"Karthikesalingam",degrees:g,deceased:a,orcid:"0000-0001-5074-898X",equal_contrib:f,matchedAffiliations:[e]},{first_name:"Carlton",last_name:"Chu",degrees:g,deceased:a,orcid:"0000-0001-8282-6364",equal_contrib:f,matchedAffiliations:[b]},{first_name:"Dawn",last_name:"Carnell",degrees:"MD
",deceased:a,orcid:"0000-0002-2898-3219",equal_contrib:f,matchedAffiliations:[h]},{first_name:"Cheng",last_name:"Boon",degrees:I,deceased:a,orcid:"0000-0003-2652-9263",equal_contrib:f,matchedAffiliations:[y]},{first_name:"Derek",last_name:"D'Souza",degrees:x,deceased:a,orcid:"0000-0002-4393-7683",equal_contrib:f,matchedAffiliations:[h]},{first_name:"Syed Ali",last_name:"Moinuddin",degrees:x,deceased:a,orcid:"0000-0002-8955-8224",equal_contrib:f,matchedAffiliations:[h]},{first_name:"Bethany",last_name:"Garie",degrees:ak,deceased:a,orcid:"0000-0003-3538-9063",equal_contrib:f,matchedAffiliations:[b]},{first_name:"Yasmin",last_name:"McQuinlan",degrees:"BRT",deceased:a,orcid:"0000-0002-8464-0640",equal_contrib:f,matchedAffiliations:[b]},{first_name:"Sarah",last_name:"Ireland",degrees:ak,deceased:a,orcid:"0000-0002-2975-2447",equal_contrib:f,matchedAffiliations:[b]},{first_name:"Kiarna",last_name:"Hampton",degrees:"MPH",deceased:a,orcid:"0000-0002-4384-6108",equal_contrib:f,matchedAffiliations:[b]},{first_name:"Krystle",last_name:"Fuller",degrees:"BAppSc (RT)",deceased:a,orcid:"0000-0003-0706-6857",equal_contrib:f,matchedAffiliations:[b]},{first_name:"Hugh",last_name:"Montgomery",degrees:"BSc, MB BS, MD",deceased:a,orcid:"0000-0001-8797-5019",equal_contrib:f,matchedAffiliations:[m]},{first_name:"Geraint",last_name:"Rees",degrees:g,deceased:a,orcid:"0000-0002-9623-7007",equal_contrib:f,matchedAffiliations:[m]},{first_name:"Mustafa",last_name:"Suleyman",degrees:c,deceased:a,orcid:"0000-0002-5415-4457",equal_contrib:f,matchedAffiliations:[z]},{first_name:"Trevor",last_name:"Back",degrees:g,deceased:a,orcid:"0000-0002-0567-8043",equal_contrib:f,matchedAffiliations:[b]},{first_name:al,last_name:am,degrees:an,deceased:a,orcid:"0000-0001-6901-0985",equal_contrib:b,matchedAffiliations:[e]},{first_name:"Joseph 
R",last_name:"Ledsam",degrees:I,deceased:a,orcid:"0000-0001-9917-7196",equal_contrib:b,matchedAffiliations:[i]},{first_name:"Olaf",last_name:"Ronneberger",degrees:g,deceased:a,orcid:"0000-0002-4266-1515",equal_contrib:b,matchedAffiliations:[b]}],affiliations:[{aff_id:108377,author_id:227388,phone:a,fax:c,corresp_aff:j,aff_type:a,seq:b,article_id:a,institution_line_1:"DeepMind",institution_line_2:c,institution_line_3:c,address_line_1:a,address_line_2:a,city:n,prov_state:a,postal_code:a,country:k},{aff_id:108406,author_id:227390,phone:a,fax:c,corresp_aff:j,aff_type:a,seq:b,article_id:a,institution_line_1:ao,institution_line_2:c,institution_line_3:c,address_line_1:ap,address_line_2:c,city:n,prov_state:a,postal_code:aq,country:k},{aff_id:108380,author_id:227391,phone:a,fax:c,corresp_aff:j,aff_type:a,seq:b,article_id:a,institution_line_1:"University College London Hospitals NHS Foundation Trust",institution_line_2:c,institution_line_3:c,address_line_1:a,address_line_2:a,city:n,prov_state:a,postal_code:a,country:k},{aff_id:108381,author_id:227402,phone:a,fax:c,corresp_aff:j,aff_type:a,seq:b,article_id:a,institution_line_1:"Clatterbridge Cancer Centre NHS Foundation Trust",institution_line_2:c,institution_line_3:c,address_line_1:a,address_line_2:a,city:"Liverpool",prov_state:a,postal_code:a,country:k},{aff_id:108382,author_id:227410,phone:a,fax:c,corresp_aff:j,aff_type:a,seq:b,article_id:a,institution_line_1:"University College London",institution_line_2:c,institution_line_3:c,address_line_1:a,address_line_2:a,city:n,prov_state:a,postal_code:a,country:k},{aff_id:108383,author_id:227412,phone:a,fax:c,corresp_aff:j,aff_type:a,seq:b,article_id:a,institution_line_1:"Google",institution_line_2:c,institution_line_3:c,address_line_1:a,address_line_2:a,city:n,prov_state:a,postal_code:a,country:k},{aff_id:108407,author_id:227415,phone:a,fax:c,corresp_aff:j,aff_type:a,seq:b,article_id:a,institution_line_1:"Google 
AI",institution_line_2:c,institution_line_3:c,address_line_1:a,address_line_2:a,city:"Tokyo",prov_state:a,postal_code:a,country:"Japan"}],primaryAuthor:{first_name:al,last_name:am,email:"cianh@google.com",degrees:an,primaryAffiliation:{fax:"1 650-253-0001",phone:"1 650-253-0000",country:k,postal_code:aq,prov_state:a,city:n,address_line_1:ap,address_line_2:c,institution_line_1:ao,institution_line_2:c,institution_line_3:c}},abstract:"Background: Over half a million individuals are diagnosed with head and neck cancer each year globally. Radiotherapy is an important curative treatment for this disease, but it requires manual time to delineate radiosensitive organs at risk. This planning process can delay treatment while also introducing interoperator variability, resulting in downstream radiation dose differences. Although auto-segmentation algorithms offer a potentially time-saving solution, the challenges in defining, quantifying, and achieving expert performance remain.\nObjective: Adopting a deep learning approach, we aim to demonstrate a 3D U-Net architecture that achieves expert-level performance in delineating 21 distinct head and neck organs at risk commonly segmented in clinical practice.\nMethods: The model was trained on a data set of 663 deidentified computed tomography scans acquired in routine clinical practice and with both segmentations taken from clinical practice and segmentations created by experienced radiographers as part of this research, all in accordance with consensus organ at risk definitions.\nResults: We demonstrated the model’s clinical applicability by assessing its performance on a test set of 21 computed tomography scans from clinical practice, each with 21 organs at risk segmented by 2 independent experts. 
We also introduced surface Dice similarity coefficient, a new metric for the comparison of organ delineation, to quantify the deviation between organ at risk surface contours rather than volumes, better reflecting the clinical task of correcting errors in automated organ segmentations. The model’s generalizability was then demonstrated on 2 distinct open-source data sets, reflecting different centers and countries to model training.\nConclusions: Deep learning is an effective and clinically applicable technique for the segmentation of the head and neck anatomy for radiotherapy. With appropriate validation studies and regulatory approvals, this system could improve the efficiency, consistency, and safety of radiotherapy pathways.\n",keywords:"artificial intelligence; machine learning; radiotherapy; convolutional neural networks; segmentation; contouring; surface dsc; unet",date_submitted:af,title_html:a,sections:[{title:"Artificial Intelligence",section_id:ag,journal_id:b,colour:r,count:1389},{title:"Machine Learning",section_id:500,journal_id:i,colour:A,count:1478},{title:"Clinical Informatics",section_id:58,journal_id:b,colour:r,count:1033},{title:"Imaging Informatics",section_id:412,journal_id:i,colour:A,count:195},{title:"Decision Support for Health Professionals",section_id:186,journal_id:i,colour:A,count:1194},{title:"Clinical Information and Decision Making",section_id:67,journal_id:b,colour:r,count:1415},{title:"Innovations and Technology in Cancer Care",section_id:297,journal_id:B,colour:ar,count:425}],preprint:l,articleKD:G,isOldOjphiMigrated:G}},articles:{recent:[],openReview:[]},articleTypes:{},authentication:{data:a,jwt:a},countries:{data:[]},departments:{data:[]},help:{data:{}},journal:{data:{journal_id:b,title:as,tag:at,description:a,path:q,slug:q,seq:b,enabled:b,environment:d,url:au,batch:b,year:av,colour:r,impact:J,order:b,published:aw,transfers:a,cite_score:ax,settings:{aboutJournal:"\u003Cp\u003EThe \u003Cem\u003EJournal of Medical Internet 
Research\u003C\u002Fem\u003E (JMIR)&nbsp;is the pioneer open access eHealth journal, and is the flagship journal of JMIR Publications. It is a leading health services and digital health journal globally in terms of quality\u002Fvisibility \u003Ca href=\"..\u002F..\u002F..\u002F..\u002F..\u002Fannouncements\u002F476\"\u003E(Journal Impact Factor&trade; 5.8 (Clarivate, 2024))\u003C\u002Fa\u003E,&nbsp;ranking Q1 in both the 'Medical Informatics' and 'Health Care Sciences &amp; Services'&nbsp;categories,&nbsp;and is also the largest journal in the field.&nbsp;The journal is \u003Ca href=\"https:\u002F\u002Fscholar.google.com\u002Fcitations?view_op=top_venues&amp;hl=en&amp;vq=eng_medicalinformatics\" rel=\"noopener\" target=\"_blank\"\u003Eranked #1 on Google Scholar\u003C\u002Fa\u003E in the 'Medical Informatics' discipline.&nbsp;The journal focuses on emerging technologies, medical devices, apps, engineering, telehealth and informatics applications for patient education, prevention, population health and clinical care.\u003C\u002Fp\u003E\r\n\u003Cp\u003EJMIR is indexed in all major literature indices including \u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fnlmcatalog\u002F100959882\"\u003ENational Library of Medicine(NLM)\u002FMEDLINE\u003C\u002Fa\u003E, \u003Ca href=\"https:\u002F\u002Fv2.sherpa.ac.uk\u002Fid\u002Fpublisher\u002F2600\"\u003ESherpa\u002FRomeo,\u003C\u002Fa\u003E&nbsp;\u003Ca href=\"https:\u002F\u002Fpubmed.ncbi.nlm.nih.gov\u002F?term=%22Journal+of+medical+Internet+research%22%5BJournal%5D&amp;sort=\"\u003EPubMed,\u003C\u002Fa\u003E&nbsp;\u003Ca href=\"https:\u002F\u002Fwww.ncbi.nlm.nih.gov\u002Fpmc\u002Fjournals\u002F224\u002F\"\u003EPMC\u003C\u002Fa\u003E,&nbsp;\u003Ca href=\"https:\u002F\u002Fwww.scopus.com\u002Fsourceid\u002F23709\"\u003EScopus\u003C\u002Fa\u003E, Psycinfo, \u003Ca href=\"https:\u002F\u002Fmjl.clarivate.com\u002Fjournal-profile\"\u003EClarivate (which includes Web of Science 
(WoS)\u002FESCI\u002FSCIE)\u003C\u002Fa\u003E, EBSCO\u002FEBSCO Essentials,&nbsp;\u003Ca href=\"https:\u002F\u002Fdoaj.org\u002Ftoc\u002F1438-8871?source=%7B%22query%22%3A%7B%22bool%22%3A%7B%22must%22%3A%5B%7B%22terms%22%3A%7B%22index.issn.exact%22%3A%5B%221438-8871%22%5D%7D%7D%5D%7D%7D%2C%22size%22%3A100%2C%22sort%22%3A%5B%7B%22created_date%22%3A%7B%22order%22%3A%22desc%22%7D%7D%5D%2C%22_source%22%3A%7B%7D%2C%22track_total_hits%22%3Atrue%7D\"\u003EDOAJ\u003C\u002Fa\u003E, GoOA and others. The \u003Cem\u003EJournal of Medical Internet Research\u003C\u002Fem\u003E received a CiteScore of \u003Ca href=\"..\u002F..\u002F..\u002F..\u002F..\u002Fannouncements\u002F471\"\u003E14.4\u003C\u002Fa\u003E, placing it in the 95th percentile (#7 of 138) as a Q1 journal in the field of Health Informatics.&nbsp;It is a selective journal complemented by almost \u003Ca href=\"https:\u002F\u002Fjmir.zendesk.com\u002Fhc\u002Fen-us\u002Farticles\u002F115001442707\" target=\"_blank\"\u003E30 specialty JMIR sister journals\u003C\u002Fa\u003E, which have a broader scope, and which together receive over 10,000 submissions a year.&nbsp;\u003C\u002Fp\u003E\r\n\u003Cp\u003EAs an open access journal, we are read by clinicians, allied health professionals, informal caregivers, and patients alike, and have (as with all JMIR journals) a focus on readable and applied science reporting the design and evaluation of health innovations and emerging technologies. We publish original research, viewpoints, and reviews (both literature reviews and medical device\u002Ftechnology\u002Fapp reviews). 
Peer-review reports are \u003Ca href=\"https:\u002F\u002Fjmir.zendesk.com\u002Fhc\u002Fen-us\u002Farticles\u002F115001714547-How-do-I-request-a-manuscript-transfer-to-another-journal-\" target=\"_blank\"\u003Eportable\u003C\u002Fa\u003E across JMIR journals and papers can be transferred, so authors save time by not having to resubmit a paper to a different journal but can simply transfer it between journals.&nbsp;\u003C\u002Fp\u003E\r\n\u003Cp\u003EWe are also a leader in participatory and open science approaches, and offer the option to publish new submissions immediately as \u003Ca href=\"http:\u002F\u002Fpreprints.jmir.org\"\u003Epreprints\u003C\u002Fa\u003E, which receive DOIs for immediate citation (eg, in grant proposals), and for open peer-review purposes. We also invite patients to participate (eg, as peer-reviewers) and have patient representatives on editorial boards.\u003C\u002Fp\u003E\r\n\u003Cp\u003EAs all JMIR journals, the journal encourages Open Science principles and strongly encourages publication of a protocol before data collection. 
Authors who have published a protocol in&nbsp;\u003Ca href=\"https:\u002F\u002Fresearchprotocols.org\u002F\"\u003EJMIR Research Protocols\u003C\u002Fa\u003E&nbsp;get a discount of 20% on the Article Processing Fee when publishing a subsequent results paper in any JMIR journal.\u003C\u002Fp\u003E\r\n\u003Cp\u003EBe a widely cited leader in the digital health revolution and&nbsp;\u003Ca href=\"..\u002F..\u002F..\u002F..\u002F..\u002Fauthor\" target=\"_blank\"\u003Esubmit your paper today\u003C\u002Fa\u003E!\u003C\u002Fp\u003E",announcementLink:"https:\u002F\u002Fwww.jmir.org\u002Fannouncements\u002F476",copyrightNotice:c,focusScopeDesc:"\u003Cp\u003EThe \"\u003Cem\u003EJournal of Medical Internet Research\u003C\u002Fem\u003E\" (JMIR; ISSN 1438-8871, Medline-abbreviation: \u003Cem\u003EJ Med Internet Res\u003C\u002Fem\u003E) \u003Ca href=\"..\u002F..\u002F2019\u002F12\u002Fe17578\u002F\"\u003E(founded in 1999, now in its 25th year!\u003C\u002Fa\u003E\u003Cspan\u003E)\u003C\u002Fspan\u003E is a leading health informatics and health services\u002Fhealth policy journal (ranking in the first quartile Q1 by Impact Factor in these disciplines) focusing on digital health, data science, health informatics and emerging technologies for health, medicine, and biomedical research. 
The journal is \u003Ca href=\"https:\u002F\u002Fscholar.google.com\u002Fcitations?view_op=top_venues&amp;hl=en&amp;vq=eng_medicalinformatics\" target=\"_blank\" rel=\"noopener\"\u003Eranked #1 on Google Scholar\u003C\u002Fa\u003E\u003Cspan\u003E in the 'Medical Informatics' discipline.&nbsp;\u003C\u002Fspan\u003E\u003C\u002Fp\u003E\r\n\u003Cp\u003EJMIR was the first open access journal covering health informatics, and the first international scientific peer-reviewed journal on all aspects of research, information and communication in the healthcare field using Internet and Internet-related technologies; a broad field, which is known as \"eHealth\" [see also \u003Ca href=\"..\u002F..\u002F2001\u002F2\u002Fe20\u002F\"\u003EWhat is eHealth\u003C\u002Fa\u003E and \u003Ca href=\"..\u002F..\u002F2001\u002F2\u002Fe22\u002F\"\u003EWhat is eHealth (2)\u003C\u002Fa\u003E], or now also \"\u003Cstrong\u003Edigital health\u003C\u002Fstrong\u003E\", which includes mHealth (mobile health). This field also has significant overlaps with what is called \"consumer health informatics\", health 2.0\u002Fmedicine 2.0, or participatory medicine. This focus makes JMIR unique among other medical or medical informatics journals, which tend to focus on clinical informatics or clinical applications. 
As eHealth\u002FmHealth is a highly interdisciplinary field we are not only inviting research papers from the medical sciences, but also from the computer, behavioral, social and communication sciences, psychology, library sciences, informatics, human-computer interaction studies, and related fields.\u003C\u002Fp\u003E\r\n\u003Cp\u003EThe term \"Internet\" is used in its broadest sense, so we are also interested in high impact studies and applications of digital medicine, mobile technologies, social media, novel wearable devices and sensors, connected home appliances, domotics etc.\u003C\u002Fp\u003E\r\n\u003Cp\u003EThe journal invites manuscripts that deal with the following topics (\u003Ca href=\"..\u002F..\u002Fthemes\"\u003Ethe main themes\u002Ftopics covered by this journal and sample papers can also be found here\u003C\u002Fa\u003E):\u003C\u002Fp\u003E\r\n\u003Cul\u003E\r\n\u003Cli\u003Enovel digital health approaches, methods, and devices\u003C\u002Fli\u003E\r\n\u003Cli\u003Elarge digital medicine \u002F digital therapeutics trials with clinical impact\u003C\u002Fli\u003E\r\n\u003Cli\u003Edata science, open data\u003C\u002Fli\u003E\r\n\u003Cli\u003Estudies evaluating the impact of Internet\u002Fsocial media use or specific eHealth\u002FmHealth interventions on individual health-related or social outcomes\u003C\u002Fli\u003E\r\n\u003Cli\u003Eevaluations and implementations of innovative mhealth (mobile health) applications, social media apps, ubiquitous computing, or innovative and emerging technologies in health\u003C\u002Fli\u003E\r\n\u003Cli\u003Edescriptions of the design and impact of Internet and mobile applications and websites or social media for consumers\u002Fpatients or medical professionals\u003C\u002Fli\u003E\r\n\u003Cli\u003Euse of the Internet, social media and mhealth in the context of clinical information and communication, including telemedicine\u003C\u002Fli\u003E\r\n\u003Cli\u003Euse of the Internet, social media, and mhealth in medical 
research and the basic sciences such as molecular biology or chemistry (e.g. bioinformatics, online factual databases)\u003C\u002Fli\u003E\r\n\u003Cli\u003Emedical information management and librarian sciences\u003C\u002Fli\u003E\r\n\u003Cli\u003Ee-learning and knowledge translation, online-courses, social media, web-based and mobile programs for undergraduate and continuing education,\u003C\u002Fli\u003E\r\n\u003Cli\u003EeHealth\u002FmHealth and social media applications for public health and population health technology (disease monitoring, teleprevention, teleepidemiology)\u003C\u002Fli\u003E\r\n\u003Cli\u003Eevidence-based medicine and the Internet and mhealth (e.g. online development or dissemination of clinical guidelines, measuring agreement about management of a given clinical problem among physicians, etc.)\u003C\u002Fli\u003E\r\n\u003Cli\u003Ethe impact of eHealth\u002FmHealth\u002FpHealth\u002FiHealth, social media, the Internet, or health care technologies on public health, the health care system and policy\u003C\u002Fli\u003E\r\n\u003Cli\u003Emethodological aspects of doing Internet\u002Fmhealth\u002Fsocial media research, e.g. methodology of web-based surveys\u003C\u002Fli\u003E\r\n\u003Cli\u003Edesign and validation of novel web-based instruments\u003C\u002Fli\u003E\r\n\u003Cli\u003Eecological momentary assessment, sensors, mobile technologies for gathering and analyzing data in real-time\u003C\u002Fli\u003E\r\n\u003Cli\u003Eanalysis of e-communities, social media communities, or virtual social networks\u003C\u002Fli\u003E\r\n\u003Cli\u003Ecomparisons of effectiveness of health communication and information on the Internet\u002FmHealth\u002Fsocial media compared with other methods of health communication,\u003C\u002Fli\u003E\r\n\u003Cli\u003Eeffects of the Internet\u002Fmhealth\u002Fsocial media and information\u002Fcommunication technology on the patient-physician relationship and impact on public health, e.g. 
the studies investigating how the patient-physician relationship changes as a result of the new ways of getting medical information\u003C\u002Fli\u003E\r\n\u003Cli\u003Eethical and legal problems as well as cross-border and cross-cultural issues of eHealth\u002FmHealth\u003C\u002Fli\u003E\r\n\u003Cli\u003Esystematic studies examining the quality of medical information available in various online venues\u003C\u002Fli\u003E\r\n\u003Cli\u003Emethods of evaluation, quality assessment and improvement of Internet information or eHealth applications\u003C\u002Fli\u003E\r\n\u003Cli\u003Eproposals for standards in the field of medical publishing on the Internet, including self-regulation issues, policies and guidelines to provide reliable healthcare information\u003C\u002Fli\u003E\r\n\u003Cli\u003Eresults and methodological aspects of Internet-based and social media studies, including medical surveys, psychological tests, quality-of-life studies, gathering and\u002For disseminating epidemiological data, use of the Internet\u002Fmobile apps\u002Fsocial media for clinical studies (e-trials), drug reaction reporting and surveillance systems etc.\u003C\u002Fli\u003E\r\n\u003Cli\u003Eelectronic medical publishing, Open Access publishing, altmetrics, and use of the Internet or social media for scholarly publishing (e.g. collaborative peer review)\u003C\u002Fli\u003E\r\n\u003Cli\u003Einformation needs of patients, consumers and health professionals, including studies evaluating search and retrieval behavior of patients\u003C\u002Fli\u003E\r\n\u003Cli\u003Eweb-based studies, e.g. 
online psychological experiments\u003C\u002Fli\u003E\r\n\u003Cli\u003Eevaluations of mhealth (mobile) applications, as well as ambient \u002F ubiquitous computing approaches, sensors, domotics, and other cutting edge technologies\u003C\u002Fli\u003E\r\n\u003Cli\u003Epersonal health records, patient portals, consumer health informatics applications\u003C\u002Fli\u003E\r\n\u003Cli\u003Ebehavior change technologies\u003C\u002Fli\u003E\r\n\u003Cli\u003EReviews, viewpoint papers and commentaries touching on the issues and themes listed above are also welcome, but should be grounded in data and\u002For a thorough literature review\u003C\u002Fli\u003E\r\n\u003C\u002Ful\u003E\r\n\u003Cp\u003EIn addition, the Journal will occasionally publish original research, reviews and tutorials on more generic, related topics such as:\u003C\u002Fp\u003E\r\n\u003Cul\u003E\r\n\u003Cli\u003EInternet standards\u003C\u002Fli\u003E\r\n\u003Cli\u003Ecybermetrics\u003C\u002Fli\u003E\r\n\u003Cli\u003Esecurity and confidentiality issues\u003C\u002Fli\u003E\r\n\u003Cli\u003EInternet demographics\u003C\u002Fli\u003E\r\n\u003Cli\u003Esocial impact of the Internet\u003C\u002Fli\u003E\r\n\u003Cli\u003Edigital imaging and multimedia\u003C\u002Fli\u003E\r\n\u003Cli\u003Ehealth care records\u003C\u002Fli\u003E\r\n\u003Cli\u003Ehigh-speed networks\u003C\u002Fli\u003E\r\n\u003Cli\u003Etelecommunication\u003C\u002Fli\u003E\r\n\u003Cli\u003Eelectronic publishing\u003C\u002Fli\u003E\r\n\u003Cli\u003Esoftware development\u003C\u002Fli\u003E\r\n\u003C\u002Ful\u003E\r\n\u003Cp\u003EThe \u003Cem\u003EJournal of Medical Internet Research\u003C\u002Fem\u003E is one of the flagship journals of \u003Ca href=\"https:\u002F\u002Fjmirpublications.com\u002F\"\u003EJMIR Publications\u003C\u002Fa\u003E and is \u003Cem\u003Ehighly selective\u003C\u002Fem\u003E. We are not a megajournal that publishes everything regardless of impact. 
To ensure a rapid turnaround time, we encourage that authors consider \u003Ca href=\"https:\u002F\u002Fsupport.jmir.org\u002Fhc\u002Fen-us\u002Farticles\u002F115001442707-Which-journal-titles-are-JMIR-Publications-currently-publishing-Journal-Portfolio-\"\u003Eother JMIR journal titles\u003C\u002Fa\u003E as well. While it is possible to transfer submissions from one journal to another before, during or after the review process (based on editorial suggestions), authors can avoid delays in decision-making by submitting to the right journal.\u003C\u002Fp\u003E\r\n\u003Cp style=\"padding-left: 40px;\"\u003EIn order to be considered for\u003Cem\u003E J Med Internet Res\u003C\u002Fem\u003E, \u003Cstrong\u003Eclinical informatics\u003C\u002Fstrong\u003E papers should have a clear connections to the major themes in this journal of consumer\u002Fpatient empowerment and participatory healthcare, and\u002For evaluate the use of mobile\u002FInternet-based\u002Femerging technologies such as patient portals. 
Other clinical informatics studies with no relationship to consumer health informatics, or more technical papers are best submitted to \u003Ca href=\"https:\u002F\u002Fsupport.jmir.org\u002Fhc\u002Fen-us\u002Farticles\u002F115001442707-Which-journal-titles-are-JMIR-Publications-currently-publishing-Journal-Portfolio-\"\u003Eother JMIR journal titles\u003C\u002Fa\u003E, such as \u003Ca href=\"http:\u002F\u002Fi-jmr.org\u002F\"\u003EInteractive Journal of Medical Research (i-JMR, a general medical journal with focus on innovation)\u003C\u002Fa\u003E, \u003Ca href=\"http:\u002F\u002Fmhealth.jmir.org\u002F\"\u003EJMIR mHealth and uHealth\u003C\u002Fa\u003E, \u003Ca href=\"http:\u002F\u002Fmedinform.jmir.org\u002F\"\u003EJMIR Medical Informatics\u003C\u002Fa\u003E, or \u003Ca href=\"https:\u002F\u002Fhumanfactors.jmir.org\"\u003EJMIR Human Factors\u003C\u002Fa\u003E.\u003C\u002Fp\u003E\r\n\u003Cp style=\"padding-left: 40px;\"\u003E\u003Cstrong\u003EMachine-learning papers\u003C\u002Fstrong\u003E: Machine learning papers are now mostly published in JMIR Medical Informatics (see e-collection \u003Ca href=\"https:\u002F\u002Fmedinform.jmir.org\u002Fthemes\u002F500-machine-learning\"\u003EMachine Learning\u003C\u002Fa\u003E), JMIR Formative Research or JMIR AI, or another sister journal, unless they have reached clinical maturity and are being used and validated in routine clinical use. Our flagship journal J Med Internet Res no longer publishes ML papers unless 1) they show a direct clinical effect or impact on care, 2) are validated using an independent dataset not used for training, 3) are written in a language that can be understood by a healthcare professional, and provide open source or a publicly available tool that can be used by others to validate or apply the findings. 
We also request that 4) reporting strictly adheres to the \"\u003Ca href=\"..\u002F..\u002F2016\u002F12\u002Fe323\u002F\"\u003EGuidelines for Developing and Reporting Machine Learning Predictive Models in Biomedical Research\u003C\u002Fa\u003E\". Highly technical papers (with mathematical formulas) are unsuitable for J Med Internet Res or this information needs to be provided in a Multimedia Appendix.&nbsp;\u003C\u002Fp\u003E\r\n\u003Cp style=\"padding-left: 40px;\"\u003E\u003Cstrong\u003EDigital psychiatry and digital mental health\u003C\u002Fstrong\u003E papers are best suited for \u003Ca href=\"http:\u002F\u002Fmental.jmir.org\u002F\"\u003EJMIR Mental Health\u003C\u002Fa\u003E if they are impactful, otherwise JMIR Formative Research publishes early stage work.\u003C\u002Fp\u003E\r\n\u003Cp style=\"padding-left: 40px;\"\u003EStudies related to public health informatics and surveillance systems should preferably be submitted to \u003Ca href=\"https:\u002F\u002Fpublichealth.jmir.org\"\u003EJMIR Public Health &amp; Surveillance\u003C\u002Fa\u003E. 
JPHS is also highly selective.\u003C\u002Fp\u003E\r\n\u003Cp style=\"padding-left: 40px;\"\u003EPapers with focus on \u003Cstrong\u003Egames in health or gamification aspects of apps\u003C\u002Fstrong\u003E and theoretical issues\u002Fcommentary on gaming are now primarily published in \u002F transferred to \u003Ca href=\"http:\u002F\u002Fgames.jmir.org\u002F\"\u003EJMIR Serious Games\u003C\u002Fa\u003E.\u003C\u002Fp\u003E\r\n\u003Cp style=\"padding-left: 40px;\"\u003EStudies evaluating systematically the \u003Cstrong\u003Equality of health information\u003C\u002Fstrong\u003E or present tools for social listening may be best suited for \u003Ca href=\"https:\u002F\u002Finfodemiology.jmir.org\"\u003EJMIR Infodemiology\u003C\u002Fa\u003E.\u003C\u002Fp\u003E\r\n\u003Cp style=\"padding-left: 40px;\"\u003E\u003Cstrong\u003EFormative work such as usability studies, pilot studies, and feasibility studies\u003C\u002Fstrong\u003E are no longer published in our flagship journals and should be submitted to \u003Ca href=\"http:\u002F\u002Fformative.jmir.org\u002F\"\u003EJMIR Formative Research\u003C\u002Fa\u003E.\u003C\u002Fp\u003E\r\n\u003Cp style=\"padding-left: 40px;\"\u003EProtocols and proposals can be submitted to \u003Ca href=\"https:\u002F\u002Fwww.researchprotocols.org\u002F\"\u003EJMIR Research Protocols\u003C\u002Fa\u003E.\u003C\u002Fp\u003E\r\n\u003Cp\u003ESubmitted manuscripts are subject to a rigorous \u003Cstrong\u003Ebut speedy peer review \u003C\u002Fstrong\u003Eprocess. 
We aim for a standard review time of less than 2 months, and a \u003Ca href=\"https:\u002F\u002Fsupport.jmir.org\u002Fhc\u002Fen-us\u002Farticles\u002F115001310127-How-to-fast-track-expedite-a-paper-and-what-are-the-benefits-\"\u003Ereview time of 4 weeks for submission to initial decision for fast-tracked papers\u003C\u002Fa\u003E).\u003C\u002Fp\u003E\r\n\u003Cp\u003EThe \u003Ca href=\"https:\u002F\u002Fsupport.jmir.org\u002Fhc\u002Fen-us\u002Farticles\u002F115004367848-What-does-the-peer-review-process-at-JMIR-journals-look-like-\"\u003Ereview process\u003C\u002Fa\u003E is designed to help authors to improve their manuscripts by giving them constructive comments on how to improve their paper, and to publish only those articles which comply to general quality criteria of a scholarly paper, especially originality, clarity, references to related work and validity of results and conclusions.\u003C\u002Fp\u003E",googleAnalyticsId:"UA-186918-1",impactFactor:J,journalDescription:"\u003Cp\u003E\u003Cstrong\u003EThe leading peer-reviewed journal for digital medicine and health and health care in the internet age.&nbsp;\u003C\u002Fstrong\u003E\u003C\u002Fp\u003E",journalInitials:"JMIR",footer:"\u003Cul style=\"display: flex; flex-wrap: wrap; justify-content: center; list-style: none;\"\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E\r\n\u003Cp style=\"text-align: center;\"\u003E\u003Ca target=\"_blank\" rel=\"noopener\"\u003E\u003Cimg src=\"https:\u002F\u002Fasset.jmir.pub\u002Fresources\u002Fimages\u002Fpartners\u002Fcrossref.jpg\" alt=\"Crossref Member\" \u002F\u003E\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fli\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E\r\n\u003Cp style=\"text-align: center;\"\u003E\u003Ca target=\"_blank\" rel=\"noopener\"\u003E\u003Cimg src=\"https:\u002F\u002Fasset.jmir.pub\u002Fresources\u002Fimages\u002Fpartners\u002Fcope.jpg\" alt=\"Committee 
on Publication Ethics\" \u002F\u003E\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fli\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E\r\n\u003Cp style=\"text-align: center;\"\u003E\u003Cimg src=\"https:\u002F\u002Fasset.jmir.pub\u002Fresources\u002Fimages\u002Fpartners\u002Fopen-access.jpg\" alt=\"Open Access\" \u002F\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fli\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E\r\n\u003Cp style=\"text-align: center;\"\u003E\u003Ca target=\"_blank\" rel=\"noopener\"\u003E\u003Cimg src=\"https:\u002F\u002Fasset.jmir.pub\u002Fresources\u002Fimages\u002Fpartners\u002Foaspa.jpg\" alt=\"Open Access Scholarly Publishers Association\" \u002F\u003E\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fli\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E&nbsp;\u003C\u002Fli\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E&nbsp;\u003C\u002Fli\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E\r\n\u003Cp style=\"text-align: center;\"\u003E\u003Ca target=\"_blank\" rel=\"noopener\"\u003E\u003Cimg src=\"https:\u002F\u002Fasset.jmir.pub\u002Fresources\u002Fimages\u002Fpartners\u002Ftrend-MD.jpg\" alt=\"TrendMD Member\" \u002F\u003E\u003Cimg src=\"https:\u002F\u002Fasset.jmir.pub\u002Fresources\u002Fimages\u002Fpartners\u002FORCID.jpg\" alt=\"ORCID Member\" \u002F\u003E\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fli\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E&nbsp;\u003C\u002Fli\u003E\r\n\u003C\u002Ful\u003E\r\n\u003Cdiv data-v-d2fa5e4c=\"\" data-v-44a23348=\"\"\u003E\r\n\u003Csection class=\"partners-wrapper\" data-test=\"partnerships-section\" data-v-d2fa5e4c=\"\"\u003E\r\n\u003Ch2 class=\"text-center\" style=\"text-align: center;\" aria-label=\"Indexed in\" 
data-v-d2fa5e4c=\"\" tabindex=\"0\"\u003E&nbsp;\u003C\u002Fh2\u003E\r\n\u003Ch2 class=\"text-center\" style=\"text-align: center;\" aria-label=\"Indexed in\" data-v-d2fa5e4c=\"\" tabindex=\"0\"\u003EThis journal is indexed in\u003C\u002Fh2\u003E\r\n\u003Cdiv class=\"green-underline\" style=\"background-color: #367c3a; height: 3px; margin: 0 auto 40px; width: 100px;\" data-v-30c6e183=\"\"\u003E&nbsp;\u003C\u002Fdiv\u003E\r\n\u003Cul style=\"display: flex; flex-wrap: wrap; justify-content: center; list-style: none;\"\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E\r\n\u003Cp style=\"text-align: center;\"\u003E\u003Ca target=\"_blank\" rel=\"noopener\"\u003E\u003Cimg src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FPubMed.jpg\" alt=\"PubMed\" \u002F\u003E\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fli\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E\r\n\u003Cp style=\"text-align: center;\"\u003E\u003Ca target=\"_blank\" rel=\"noopener\"\u003E\u003Cimg class=\"image_resized\" style=\"aspect-ratio: 461\u002F81; width: 64.99%;\" src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FPMC.jpg\" alt=\"PubMed Central\" width=\"461\" height=\"81\" \u002F\u003E\u003Cimg style=\"aspect-ratio: 240\u002F81;\" src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FMEDLINE.jpg\" alt=\"MEDLINE\" width=\"240\" height=\"81\" \u002F\u003E\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fli\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E&nbsp;\u003C\u002Fli\u003E\r\n\u003Cli style=\"width: 
100%;\"\u003E\r\n\u003Cp style=\"text-align: center;\"\u003E\u003Ca target=\"_blank\" rel=\"noopener noreferrer\" href=\"https:\u002F\u002Fwww.scopus.com\u002Fsourceid\u002F23709\"\u003E\u003Cimg class=\"image_resized\" style=\"aspect-ratio: 419\u002F153;\" src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FScopus-1.jpg\" width=\"239\" height=\"87\" \u002F\u003E\u003C\u002Fa\u003E\u003Ca target=\"_blank\" rel=\"noopener noreferrer\" href=\"https:\u002F\u002Fdoaj.org\u002Ftoc\u002Fc9b232aa100e40f6bb8aa3f0189f08a5\"\u003E\u003Cimg src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FDOAJ-1.jpg\" alt=\"DOAJ\" width=\"253\" height=\"84\" \u002F\u003E\u003Cimg class=\"image_resized\" style=\"aspect-ratio: 110\u002F110;\" src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FDOAJ%20seal.png\" alt=\"DOAJ Seal\" width=\"173\" height=\"75\" \u002F\u003E\u003C\u002Fa\u003E\u003Cimg src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FCINAHL.jpg\" alt=\"CINAHL (EBSCO)\" width=\"200\" height=\"81\" \u002F\u003E\u003Cimg class=\"image_resized\" style=\"aspect-ratio: 245\u002F81;\" src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FPsycInfo.jpg\" alt=\"PsycInfo\" width=\"245\" height=\"81\" \u002F\u003E\u003Ca target=\"_blank\" rel=\"noopener noreferrer\" href=\"https:\u002F\u002Fv2.sherpa.ac.uk\u002Fid\u002Fpublication\u002F32256\"\u003E\u003Cimg 
src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FSherpa%20Romeo.jpg\" alt=\"Sherpa Romeo\" width=\"287\" height=\"81\" \u002F\u003E\u003C\u002Fa\u003E\u003Cimg style=\"aspect-ratio: 287\u002F81;\" src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FEBSCO%20Essentials.jpg\" alt=\"EBSCO\u002FEBSCO Essentials\" width=\"395\" height=\"81\" \u002F\u003E\u003Cimg style=\"aspect-ratio: 287\u002F81;\" src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FGoOA.jpg\" width=\"498\" height=\"100\" \u002F\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fli\u003E\r\n\u003Cli\u003E&nbsp;\u003C\u002Fli\u003E\r\n\u003Cli style=\"width: 74.64%;\"\u003E\r\n\u003Cp style=\"text-align: center;\"\u003E\u003Ca target=\"_blank\" rel=\"noopener noreferrer\" href=\"https:\u002F\u002Fdoaj.org\u002Ftoc\u002Fc9b232aa100e40f6bb8aa3f0189f08a5\"\u003E\u003Cimg class=\"image_resized\" style=\"aspect-ratio: 498\u002F100;\" src=\"https:\u002F\u002F19668141.fs1.hubspotusercontent-na1.net\u002Fhubfs\u002F19668141\u002F00%20Marketing\u002FLogos\u002FExternal%20logos%20for%20journal%20pages\u002FSCIE.png\" alt=\"Web of Science - SCIE\" width=\"136\" height=\"135\" \u002F\u003E\u003C\u002Fa\u003E\u003C\u002Fp\u003E\r\n\u003C\u002Fli\u003E\r\n\u003Cli style=\"margin-bottom: 10px; margin-right: 10px; margin-top: 10px;\"\u003E\r\n\u003Cp style=\"text-align: center;\"\u003E&nbsp;\u003C\u002Fp\u003E\r\n\u003C\u002Fli\u003E\r\n\u003Cli\u003E&nbsp;\u003C\u002Fli\u003E\r\n\u003Cli\u003E\u003Ca target=\"_blank\" 
rel=\"noopener\"\u003E&nbsp;\u003C\u002Fa\u003E\u003C\u002Fli\u003E\r\n\u003C\u002Ful\u003E\r\n\u003Cp\u003E&nbsp;\u003C\u002Fp\u003E\r\n\u003C\u002Fsection\u003E\r\n\u003C\u002Fdiv\u003E",onlineIssn:"1438-8871",searchDescription:"Journal of Medical Internet Research - International Scientific Journal for Medical Research, Information and Communication on the Internet",searchKeywords:"Medical, Medicine, Internet, Research, Journal, ehealth, JMIR, open access publishing, medical research, medical informatics",articlesWidget:{enabled:l,count:C,label:"Recent Articles"},openReviewWidget:{enabled:l,count:C,label:"\u003Ca href=\"https:\u002F\u002Fpreprints.jmir.org\"\u003EPreprints\u003C\u002Fa\u003E Open for Peer-Review"},searchWidget:{enabled:l},partnershipsWidget:{enabled:l},submitButton:{enabled:l,label:"Submit Article"},editorInChief:"\u003Cp\u003EGunther Eysenbach, MD, MPH, FACMI, Founding Editor and Publisher; \u003Cspan\u003EAdjunct Professor, School of Health Information Science, University of Victoria, Canada\u003C\u002Fspan\u003E\u003C\u002Fp\u003E"}}},journals:{data:[{journal_id:b,title:as,tag:at,description:a,path:q,slug:q,seq:b,enabled:b,environment:d,url:au,batch:b,year:av,colour:r,impact:J,order:b,published:aw,transfers:a,cite_score:ax},{journal_id:m,title:"JMIR Research Protocols",tag:"Ongoing trials, grant proposals, formative research, methods, early results. June 2024 - Journal Impact Factor: 1.4 (Source: Journal Citation Reports™ 2024 from Clarivate™)",description:"JMIR Res Protoc publishes research protocols, current and ongoing trials, and grant proposals in all areas of medicine (with an initial focus on ehealth\u002Fmhealth). Publish your work in this journal to let others know what you are working on, to facilitate collaboration and\u002For recruitment, to avoid duplication of efforts, to create a citable record of a research design idea, and to aid systematic reviewers in compiling evidence. 
Research protocols or grant proposals that are funded and have undergone peer-review will receive an expedited review if you upload peer-review reports as supplementary files.",path:"resprot",slug:"researchprotocols",seq:e,enabled:b,environment:d,url:"https:\u002F\u002Fwww.researchprotocols.org",batch:b,year:K,colour:"#837a7a",impact:"1.4",order:v,published:4321,transfers:a,cite_score:"2.4"},{journal_id:ay,title:"JMIR Formative Research",tag:"Process evaluations, early results and feasibility\u002Fpilot studies of digital and non-digital interventions. June 2024 - Journal Impact Factor: 2.0 (Source: Journal Citation Reports™ 2024 from Clarivate™)",description:c,path:az,slug:az,seq:D,enabled:b,environment:d,url:"https:\u002F\u002Fformative.jmir.org",batch:e,year:L,colour:"#605959",impact:"2.0",order:M,published:3050,transfers:a,cite_score:"2.7"},{journal_id:M,title:"JMIR mHealth and uHealth",tag:"Focused on health and biomedical applications in mobile and tablet computing, pervasive and ubiquitous computing, wearable computing and domotics. June 2024 - Journal Impact Factor: 5.4. Q1 journal in \"Health Care Sciences & Services\" and \"Medical Informatics\" categories. 
(Source: Journal Citation Reports™ 2024 from Clarivate™)",description:"JMIR mhealth and uhealth is a new journal focussing on mobile and ubiquitous health technologies, including smartphones, augmented reality (Google Glasses), intelligent domestic devices, implantable devices, and other technologies designed to maintain health and improve life.",path:aA,slug:aA,seq:h,enabled:b,environment:d,url:"https:\u002F\u002Fmhealth.jmir.org",batch:e,year:N,colour:aB,impact:"5.4",order:e,published:2730,transfers:a,cite_score:"12.6"},{journal_id:45,title:"Online Journal of Public Health Informatics",tag:"A leading peer-reviewed, open access journal dedicated to the dissemination of high-quality research and innovation in the field of public health informatics.",description:a,path:aC,slug:aC,seq:O,enabled:b,environment:d,url:"https:\u002F\u002Fojphi.jmir.org",batch:a,year:aD,colour:"#3399FF",impact:c,order:O,published:1717,transfers:a,cite_score:a},{journal_id:P,title:"JMIR Public Health and Surveillance",tag:"A multidisciplinary journal that focuses on the intersection of public health and technology, public health informatics, mass media campaigns, surveillance, participatory epidemiology, and innovation in public health practice and research. June 2024 - Journal Impact Factor: 3.5. Q1 journal in \"Public, Environmental & Occupational Health\" category (Source: Journal Citation Reports™ 2024 from Clarivate™)",description:"Innovations in Public Health practice and research",path:aE,slug:aE,seq:z,enabled:b,environment:d,url:"https:\u002F\u002Fpublichealth.jmir.org",batch:b,year:s,colour:"#01538A",impact:Q,order:z,published:1639,transfers:a,cite_score:"13.7"},{journal_id:i,title:"JMIR Medical Informatics",tag:"Clinical informatics, decision support for health professionals, electronic health records, and eHealth infrastructures. 
June 2024 - Journal Impact Factor: 3.1 (Source: Journal Citation Reports™ 2024 from Clarivate™)",description:"Clinical informatics",path:aF,slug:aF,seq:m,enabled:b,environment:d,url:"https:\u002F\u002Fmedinform.jmir.org",batch:e,year:N,colour:A,impact:"3.1",order:u,published:1405,transfers:a,cite_score:"7.9"},{journal_id:R,title:"JMIR Mental Health",tag:"A journal focused on Internet interventions, technologies, and digital innovations for mental health and behavior change. Official journal of the Society for Digital Psychiatry. June 2024 - Journal Impact Factor: 4.8. Q1 journal in \"Psychiatry\" category. (Source: Journal Citation Reports™ 2024 from Clarivate™)",description:c,path:aG,slug:aG,seq:i,enabled:b,environment:d,url:"https:\u002F\u002Fmental.jmir.org",batch:b,year:S,colour:"#45936C",impact:aH,order:y,published:1073,transfers:a,cite_score:"10.8"},{journal_id:z,title:"JMIR Human Factors",tag:"Making health care interventions and technologies usable, safe, and effective. June 2024 - Journal Impact Factor: 2.6 (Source: Journal Citation Reports™ 2024 from Clarivate™)",description:"Usability Studies and Ergonomics",path:aI,slug:aI,seq:aJ,enabled:b,environment:d,url:"https:\u002F\u002Fhumanfactors.jmir.org",batch:e,year:S,colour:"#008C9E",impact:"2.6",order:aK,published:798,transfers:a,cite_score:"3.4"},{journal_id:v,title:"JMIR Serious Games",tag:"A multidisciplinary journal on gaming and gamification including simulation and immersive virtual reality for health education\u002Fpromotion, teaching, medicine, rehabilitation, and social change. June 2024 - Journal Impact Factor: 3.8. Q1 journal in \"Health Care Sciences & Services\" category. 
(Source: Journal Citation Reports™ 2024 from Clarivate™)",description:"Serious games for health and social change",path:aL,slug:aL,seq:P,enabled:b,environment:d,url:"https:\u002F\u002Fgames.jmir.org",batch:b,year:N,colour:"#4A5A67",impact:"3.8",order:m,published:630,transfers:a,cite_score:"7.3"},{journal_id:T,title:"JMIR Medical Education",tag:"Technology, innovation and openess in medical education in the information age. June 2024 - Journal Impact Factor: 3.2. Q1 journal in \"Education, Scientific Disciplines\" category. (Source: Journal Citation Reports™ 2024 from Clarivate™)",description:c,path:aM,slug:aM,seq:aK,enabled:b,environment:d,url:"https:\u002F\u002Fmededu.jmir.org",batch:e,year:s,colour:"#6678A6",impact:aN,order:P,published:552,transfers:a,cite_score:"6.9"},{journal_id:y,title:"Iproceedings",tag:"Electronic Proceedings, Presentations and Posters of Leading Conferences",description:c,path:aO,slug:aO,seq:M,enabled:b,environment:d,url:"https:\u002F\u002Fwww.iproc.org",batch:e,year:s,colour:"#6F7D80",impact:a,order:U,published:510,transfers:a,cite_score:a},{journal_id:h,title:"Interactive Journal of Medical Research",tag:"A new general medical journal for the 21st centrury, focusing on innovation in health and medical research. June 2024 - Journal Impact Factor: 1.9 (Source: Journal Citation Reports™ 2024 from Clarivate™)",description:c,path:"ijmr",slug:"i-jmr",seq:y,enabled:b,environment:d,url:"https:\u002F\u002Fwww.i-jmr.org",batch:e,year:K,colour:"#22B2C1",impact:"1.9",order:aP,published:429,transfers:a,cite_score:a},{journal_id:V,title:"JMIR Aging",tag:"Digital health technologies, apps, and informatics for patient education, medicine and nursing, preventative interventions, and clinical care \u002F home care for elderly populations. June 2024 - Journal Impact Factor: 5.0. Q1 journal in \"Geriatrics & Gerontology\", \"Gerontology\" and \"Medical Informatics\" categories. 
(Source: Journal Citation Reports™ 2024 from Clarivate™)",description:c,path:aQ,slug:aQ,seq:U,enabled:b,environment:d,url:"https:\u002F\u002Faging.jmir.org",batch:e,year:t,colour:"#979bc4",impact:"5.0",order:h,published:424,transfers:a,cite_score:"6.5"},{journal_id:aR,title:"JMIRx Med",tag:aS,description:a,path:aT,slug:aT,seq:W,enabled:b,environment:d,url:"https:\u002F\u002Fxmed.jmir.org",batch:a,year:X,colour:"#3187df",impact:c,order:R,published:420,transfers:a,cite_score:a},{journal_id:U,title:"JMIR Pediatrics and Parenting",tag:"Improving pediatric and adolescent health outcomes and empowering and educating parents. June 2024 - Journal Impact Factor: 2.1. Q1 journal in \"Pediatrics\" category. (Source: Journal Citation Reports™ 2024 from Clarivate™)",description:c,path:aU,slug:aU,seq:Y,enabled:b,environment:d,url:"https:\u002F\u002Fpediatrics.jmir.org",batch:e,year:t,colour:"#d2a9ad",impact:"2.1",order:C,published:401,transfers:a,cite_score:"5"},{journal_id:B,title:"JMIR Cancer",tag:"Patient-centered innovations, education and technology for cancer care, cancer survivorship, and cancer research. 
June 2024 - Journal Impact Factor: 3.3 (Source: Journal Citation Reports™ 2024 from Clarivate™)",description:c,path:aV,slug:aV,seq:C,enabled:b,environment:d,url:"https:\u002F\u002Fcancer.jmir.org",batch:e,year:s,colour:ar,impact:"3.3",order:aJ,published:381,transfers:a,cite_score:"4.1"},{journal_id:Y,title:"JMIR Dermatology",tag:"Technologies, devices, apps, and informatics applications for patient education in dermatology, including preventative interventions, and clinical care for dermatological populations",description:c,path:aW,slug:aW,seq:E,enabled:b,environment:d,url:"https:\u002F\u002Fderma.jmir.org",batch:e,year:t,colour:"#ecac7d",impact:a,order:w,published:307,transfers:a,cite_score:"1.2"},{journal_id:w,title:"JMIR Diabetes",tag:"Emerging Technologies, Medical Devices, Apps, Sensors, and Informatics to Help People with Diabetes.",description:c,path:aX,slug:aX,seq:v,enabled:b,environment:d,url:"https:\u002F\u002Fdiabetes.jmir.org",batch:e,year:Z,colour:"#5c89c7",impact:c,order:T,published:266,transfers:a,cite_score:"4"},{journal_id:_,title:"JMIR Rehabilitation and Assistive Technologies",tag:"Development and Evaluation of Rehabilitation, Physiotherapy and Assistive Technologies, Robotics, Prosthetics and Implants, Mobility and Communication Tools, Home Automation and Telerehabilitation",description:c,path:aY,slug:aY,seq:aP,enabled:b,environment:d,url:"https:\u002F\u002Frehab.jmir.org",batch:e,year:S,colour:"#15638E",impact:a,order:B,published:263,transfers:a,cite_score:"4.2"},{journal_id:E,title:"JMIR Cardio",tag:"Electronic, mobile, digital health approaches in cardiology and for cardiovascular health.\r \r",description:c,path:aZ,slug:aZ,seq:B,enabled:b,environment:d,url:"https:\u002F\u002Fcardio.jmir.org",batch:e,year:L,colour:"#791f20",impact:a,order:D,published:194,transfers:a,cite_score:Q},{journal_id:38,title:"JMIR Infodemiology",tag:"Focusing on determinants and distribution of health information and misinformation on the internet, and its effect on 
public and individual health. June 2024 - Journal Impact Factor: 3.5. Q1 journal in \"Health Care Sciences & Services\" and \"Public, Environmental, and Occupational Health\" categories. (Source: Journal Citation Reports™ 2024 from Clarivate™)",description:"Focusing on determinants and distribution of health information and misinformation on the internet, and its effect on public and individual health.",path:a_,slug:a_,seq:37,enabled:b,environment:d,url:"https:\u002F\u002Finfodemiology.jmir.org",batch:e,year:ah,colour:"#32A852",impact:Q,order:i,published:149,transfers:a,cite_score:aH},{journal_id:a$,title:"JMIR AI",tag:"The leading peer-reviewed journal for digital medicine, and health & health care in the internet age",description:"JMIR AI is a new journal that focuses on the applications of AI in health settings. This includes contemporary developments as well as historical examples, with an emphasis on sound methodological evaluations of AI techniques and authoritative analyses. It is intended to be the main source of reliable information for health informatics professionals to learn about how AI techniques can be applied and evaluated.",path:ba,slug:ba,seq:O,enabled:b,environment:d,url:"https:\u002F\u002Fai.jmir.org",batch:e,year:bb,colour:$,impact:c,order:F,published:118,transfers:a,cite_score:a},{journal_id:aa,title:"JMIR Perioperative Medicine",tag:"Technologies for pre- and post-operative education, preventative interventions and clinical care for surgery and anaesthesiology patients, as well as informatics applications in anesthesia, surgery, critical care and pain medicine",description:c,path:bc,slug:bc,seq:V,enabled:b,environment:d,url:"https:\u002F\u002Fperiop.jmir.org",batch:e,year:t,colour:"#187662",impact:a,order:ab,published:115,transfers:a,cite_score:"0.5"},{journal_id:W,title:"JMIR Nursing",tag:"Modern nursing in the age of information technology, patient empowerment and preventative, predictive, personal 
medicine",description:a,path:bd,slug:bd,seq:aa,enabled:b,environment:d,url:"https:\u002F\u002Fnursing.jmir.org",batch:e,year:t,colour:"#429a99",impact:a,order:ay,published:114,transfers:a,cite_score:"5.2"},{journal_id:be,title:"Journal of Participatory Medicine",tag:"The Journal of Participatory Medicine is a peer-reviewed, open access journal with the mission to advance the understanding and practice of participatory medicine among health care professionals and patients.\n\nIt is the Official Journal of the Society for Participatory Medicine.",description:c,path:bf,slug:bf,seq:w,enabled:b,environment:d,url:"https:\u002F\u002Fjopm.jmir.org",batch:e,year:aD,colour:"#2ea3f2",impact:a,order:ac,published:96,transfers:a,cite_score:aN},{journal_id:ac,title:"JMIR Biomedical Engineering",tag:"Engineering for health technologies, medical devices, and innovative medical treatments and procedures.",description:c,path:bg,slug:bg,seq:ad,enabled:b,environment:d,url:"https:\u002F\u002Fbiomedeng.jmir.org",batch:e,year:Z,colour:bh,impact:a,order:E,published:90,transfers:a,cite_score:a},{journal_id:ad,title:"JMIR Bioinformatics and Biotechnology",tag:"Methods, devices, web-based platforms, open data and open software tools for big data analytics, understanding biological\u002Fmedical data, and information retrieval in biology and medicine.",description:c,path:bi,slug:bi,seq:F,enabled:b,environment:d,url:"https:\u002F\u002Fbioinform.jmir.org",batch:e,year:X,colour:bh,impact:a,order:be,published:51,transfers:a,cite_score:"2.9"},{journal_id:e,title:"Medicine 2.0",tag:"Official proceedings publication of the Medicine 2.0 Congress",description:a,path:"med20",slug:"medicine20",seq:u,enabled:b,environment:d,url:"https:\u002F\u002Fwww.medicine20.com",batch:b,year:K,colour:aB,impact:a,order:V,published:E,transfers:a,cite_score:a},{journal_id:43,title:"Asian\u002FPacific Island Nursing Journal",tag:"The official journal of the Asian American \u002F Pacific Islander Nurses Association 
(AAPINA), devoted to the exchange of knowledge in relation to Asian and Pacific Islander health and nursing care. Created to fill the gap between nursing science and behavioral\u002Fsocial sciences, the journal offers a forum for empirical, theoretical and methodological issues related to Asian American \u002F Pacific Islander ethnic, cultural values and beliefs and biological and physiological phenomena that can affect nursing care.",description:"The official journal of the Asian American \u002F Pacific Islander Nurses Association, this is a peer-reviewed, open access journal for the exchange of knowledge in relation to Asian and Pacific Islander health and nursing care. It will serve as a voice for nursing and other health care providers for research, education, and practice.",path:bj,slug:bj,seq:a$,enabled:b,environment:d,url:"https:\u002F\u002Fapinj.jmir.org",batch:e,year:L,colour:bk,impact:c,order:bl,published:ab,transfers:a,cite_score:"1.8"},{journal_id:42,title:"JMIR Neurotechnology",tag:"Cross-disciplinary journal that connects the broad domains of clinical neuroscience and all related technologies. 
The journal provides a space for the publication of research exploring how technologies can be applied in clinical neuroscience (e.g., neurology, neurosurgery, neuroradiology) to prevent, diagnose, and treat neurological disorders.",description:"JMIR Neuro is an innovative new journal which aims to bridge clinical neurology & neurosurgery with advances in the web space and digital technologies",path:bm,slug:bm,seq:bl,enabled:b,environment:d,url:"https:\u002F\u002Fneuro.jmir.org",batch:e,year:bb,colour:bk,impact:a,order:ad,published:ac,transfers:a,cite_score:a},{journal_id:35,title:"JMIRx Bio",tag:aS,description:c,path:bn,slug:bn,seq:aR,enabled:b,environment:d,url:"https:\u002F\u002Fxbio.jmir.org",batch:e,year:2023,colour:"#bf2433",impact:a,order:_,published:D,transfers:a,cite_score:a},{journal_id:46,title:"JMIR XR and Spatial Computing (JMXR)",tag:"A new peer-reviewed journal for extended reality and spatial computing in health and health care. ",description:a,path:bo,slug:bo,seq:bp,enabled:b,environment:d,url:"https:\u002F\u002Fxr.jmir.org",batch:e,year:2024,colour:"#887ECB",impact:a,order:bp,published:F,transfers:a,cite_score:a},{journal_id:ab,title:"JMIR Data",tag:"A muldisciplinary journal to publish open datasets for analysis and re-analysis",description:c,path:bq,slug:bq,seq:T,enabled:b,environment:d,url:"https:\u002F\u002Fdata.jmir.org",batch:e,year:X,colour:"#5A6672",impact:a,order:W,published:m,transfers:a,cite_score:a},{journal_id:D,title:"JMIR Challenges",tag:"JMIR Challenges is a new platform connecting \"solution-seekers\" (sponsors such as companies or other researchers) with \"solution-providers\" (entrants, such as innovators, researchers, or developers in the ehealth space)",description:c,path:br,slug:br,seq:_,enabled:b,environment:d,url:"https:\u002F\u002Fchallenges.jmir.org",batch:e,year:Z,colour:$,impact:a,order:aa,published:e,transfers:a,cite_score:a},{journal_id:F,title:"JMIR Preprints",tag:"A preprint server for 
pre-publication\u002Fpre-peer-review preprints intended for community review as well as ahead-of-print (accepted) manuscripts",description:"Publish your paper for open peer-review.",path:bs,slug:bs,seq:R,enabled:b,environment:d,url:"https:\u002F\u002Fpreprints.jmir.org",batch:b,year:s,colour:$,impact:a,order:Y,published:b,transfers:a,cite_score:a}]},license:{},page:{data:a},pages:{data:{}},preprint:{data:a},searchArticles:{data:{}},searchAuthors:{data:[],pagination:{}},searchHelp:{data:{}},sections:{data:[],allSections:[],journalSections:[]},submission:{data:{id:a,files:{toc:[],figures:[],appendicies:[],other:[]}},file:a,affiliation:a,suggestedAffiliations:a,events:{extraction:[]},errors:{}},subscription:{},theme:{data:[]},themes:{random:a,data:[],sortType:"pubDate",sortOrder:"desc"}},serverRendered:l,routePath:ai,config:{environment:d,_app:{basePath:"\u002F",assetsPath:"\u002F_nuxt\u002F",cdnURL:a}}}}(null,1,"","production",2,-128,"PhD",3,7,0,"United Kingdom",true,5,"London","California Correctional Health Care Services","California","jmir","#247CB3",2015,2018,10,15,23,"MSc",4,6,"#82ABB9",21,12,22,26,18,false,"Clinical Social Worker","MB ChB","5.8",2012,2017,13,2013,39,9,"3.5",16,2014,20,30,31,33,2020,29,2016,17,"#666666",32,25,24,19,"Blythe","2020-11-30T10:25:30.000Z",797,2021,"\u002F2021\u002F7\u002Fe26151","MEng","BMRSc (RT)","Cían Owen","Hughes","MBChB, MRCS, MSc","Google Health","6 Pancras Square","N1C 4AG","#584677","Journal of Medical Internet Research","The leading peer-reviewed journal for digital medicine and health and health care in the internet age. June 2024 - Journal Impact Factor: 5.8. 
Q1 journal in \"Medical Informatics\" and \"Health Care Sciences & Services\" categories.(Source: Journal Citation Reports™ 2024 from Clarivate™)","https:\u002F\u002Fwww.jmir.org",1999,9001,"14.4",27,"formative","mhealth","#65AD8C","ojphi",2009,"publichealth","medinform","mental","4.8","humanfactors",8,11,"games","mededu","3.2","iproc",14,"aging",34,"Overlay journal for preprints with post-review manuscript marketplace","xmed","pediatrics","cancer","derma","diabetes","rehab","cardio","infodemiology",41,"ai",2022,"periop","nursing",28,"jopm","biomedeng","#474760","bioinform","apinj","#3399ff",36,"neuro","xbio","xr",40,"data","challenges","preprints"));</script><script src="/_nuxt/d762be0.js" defer></script><script src="/_nuxt/f3339c8.js" defer></script><script src="/_nuxt/76715db.js" defer></script><script src="/_nuxt/114bf28.js" defer></script><script src="/_nuxt/557a776.js" defer></script><script src="/_nuxt/f65c9d0.js" defer></script><script src="/_nuxt/b7cb769.js" defer></script><script src="/_nuxt/65075b6.js" defer></script><script src="/_nuxt/8499a81.js" defer></script><script src="/_nuxt/60620ca.js" defer></script><script src="/_nuxt/7bab8a9.js" defer></script><script src="/_nuxt/6c336c8.js" defer></script><script src="/_nuxt/e63f738.js" defer></script><script src="/_nuxt/2f3783d.js" defer></script><script src="/_nuxt/8679c54.js" defer></script><script src="/_nuxt/84ba7a4.js" defer></script><script src="/_nuxt/2042386.js" defer></script><script src="/_nuxt/7aa873e.js" defer></script><script src="/_nuxt/4f5e72f.js" defer></script><script src="/_nuxt/512387a.js" defer></script><script src="/_nuxt/69df2f4.js" defer></script><script src="/_nuxt/c8abcfd.js" defer></script><script src="/_nuxt/ef88041.js" defer></script><script src="/_nuxt/4e67ec6.js" defer></script><script src="/_nuxt/edec9c1.js" defer></script><script src="/_nuxt/a0160a8.js" defer></script><script src="/_nuxt/88d69a9.js" defer></script><script src="/_nuxt/456ce06.js" defer></script><script 
src="/_nuxt/3049f10.js" defer></script></body></html>
