
ImageNet Large Scale Visual Recognition Challenge

Olga Russakovsky, Jia Deng, Hao Su, Jonathan Krause, Sanjeev Satheesh, Sean Ma, Zhiheng Huang, Andrej Karpathy, Aditya Khosla, Michael Bernstein, Alexander C. Berg, Li Fei-Fei

International Journal of Computer Vision, Volume 115, Issue 3, pp. 211–252 (December 2015). Published online 11 April 2015.
DOI: 10.1007/s11263-015-0816-y
© 2015 Springer Science+Business Media New York

Affiliations: Stanford University, Stanford, USA (O. Russakovsky, H. Su, J. Krause, S. Satheesh, S. Ma, Z. Huang, A. Karpathy, M. Bernstein, L. Fei-Fei); University of Michigan, Ann Arbor, USA (J. Deng); Massachusetts Institute of Technology, Cambridge, USA (A. Khosla); UNC Chapel Hill, Chapel Hill, USA (A. C. Berg).

Abstract

The ImageNet Large Scale Visual Recognition Challenge is a benchmark in object category classification and detection on hundreds of object categories and millions of images. The challenge has been run annually from 2010 to present, attracting participation from more than fifty institutions. This paper describes the creation of this benchmark dataset and the advances in object recognition that have been possible as a result. We discuss the challenges of collecting large-scale ground truth annotation, highlight key breakthroughs in categorical object recognition, provide a detailed analysis of the current state of the field of large-scale image classification and object detection, and compare the state-of-the-art computer vision accuracy with human accuracy. We conclude with lessons learned in the 5 years of the challenge, and propose future directions and improvements.
solid #fff;border-right:0;margin:0 17px 8px -9px;padding:0 0 0 8px}.app-article-masthead .c-article-identifiers__item--cite{border-left:0}.app-article-metrics-bar{display:flex;flex-wrap:wrap;font-size:1rem;padding:16px 0 0;row-gap:24px}.app-article-metrics-bar__item{padding:0 16px 0 0}.app-article-metrics-bar__count{font-weight:700}.app-article-metrics-bar__label{font-weight:400;padding-left:4px}.app-article-metrics-bar__icon{height:auto;margin-right:4px;margin-top:-4px;width:auto}.app-article-metrics-bar__arrow-icon{margin:4px 0 0 4px}.app-article-metrics-bar a{color:#000}.app-article-metrics-bar .app-article-metrics-bar__item--metrics{padding-right:0}.app-overview-section .c-article-author-list,.app-overview-section__authors{line-height:2}.app-article-metrics-bar{margin-top:8px}.c-book-toc-pagination+.c-book-section__back-to-top{margin-top:0}.c-article-body .c-article-access-provider__text--chapter{color:#222;font-family:Merriweather Sans,Helvetica Neue,Helvetica,Arial,sans-serif;padding:20px 0}.c-article-body .c-article-access-provider__text--chapter svg.c-status-message__icon{fill:#003f8d;vertical-align:middle}.c-article-body-section__content--separator{padding-top:40px}.c-pdf-download__link{max-height:44px}.app-article-access .u-button--primary,.app-article-access .u-button--primary:visited{color:#fff}.c-article-sidebar{display:none}@media only screen and (min-width:1024px){.c-article-sidebar{display:block}}.c-cod__form{border-radius:12px}.c-cod__label{font-size:.875rem}.c-cod .c-status-message{align-items:center;justify-content:center;margin-bottom:16px;padding-bottom:16px}@media only screen and (min-width:1024px){.c-cod .c-status-message{align-items:inherit}}.c-cod .c-status-message__icon{margin-top:4px}.c-cod .c-cod__prompt{font-size:1rem;margin-bottom:16px}.c-article-body .app-article-access,.c-book-body .app-article-access{display:block}@media only screen and (min-width:1024px){.c-article-body .app-article-access,.c-book-body .app-article-access{display:none}}.c-article-body .app-card-service{margin-bottom:32px}@media only screen and (min-width:1024px){.c-article-body .app-card-service{display:none}}.app-article-access .buybox__buy .u-button--secondary,.app-article-access .u-button--primary,.c-cod__row .u-button--primary{background-color:#025e8d;border:2px solid #025e8d;box-shadow:none;font-size:1rem;font-weight:700;gap:8px 8px;justify-content:center;line-height:1.5;padding:8px 24px}.app-article-access .buybox__buy .u-button--secondary,.app-article-access .u-button--primary:hover,.c-cod__row .u-button--primary:hover{background-color:#fff;color:#025e8d}.app-article-access .buybox__buy .u-button--secondary:hover{background-color:#025e8d;color:#fff}.buybox__buy .c-notes__text{color:#666;font-size:.875rem;padding:0 16px 8px}.c-cod__input{flex-basis:auto;width:100%}.c-article-title{font-family:Merriweather Sans,Helvetica Neue,Helvetica,Arial,sans-serif;font-size:2.25rem;font-weight:700;line-height:1.2;margin:12px 0}.c-reading-companion__figure-item figure{margin:0}@media only screen and (min-width:768px){.c-article-title{margin:16px 0}}.app-article-access{border:1px solid #c5e0f4;border-radius:12px}.app-article-access__heading{border-bottom:1px solid #c5e0f4;font-family:Merriweather Sans,Helvetica Neue,Helvetica,Arial,sans-serif;font-size:1.125rem;font-weight:700;margin:0;padding:16px;text-align:center}.app-article-access .buybox__info svg{vertical-align:middle}.c-article-body .app-article-access p{margin-bottom:0}.app-article-access .buybox__info{font-family:Merriweather 
Sans,Helvetica Neue,Helvetica,Arial,sans-serif;font-size:1rem;margin:0}.app-article-access{margin:0 0 32px}@media only screen and (min-width:1024px){.app-article-access{margin:0 0 24px}}.c-status-message{font-size:1rem}.c-article-body{font-size:1.125rem}.c-article-body dl,.c-article-body ol,.c-article-body p,.c-article-body ul{margin-bottom:32px;margin-top:0}.c-article-access-provider__text:last-of-type,.c-article-body .c-notes__text:last-of-type{margin-bottom:0}.c-article-body ol p,.c-article-body ul p{margin-bottom:16px}.c-article-section__figure-caption{font-family:Merriweather Sans,Helvetica Neue,Helvetica,Arial,sans-serif}.c-reading-companion__figure-item{border-top-color:#c5e0f4}.c-reading-companion__sticky{max-width:400px}.c-article-section .c-article-section__figure-description>*{font-size:1rem;margin-bottom:16px}.c-reading-companion__reference-item{border-top:1px solid #d5d5d5;padding:16px 0}.c-reading-companion__reference-item:first-child{padding-top:0}.c-article-share-box__button,.js .c-article-authors-search__item .c-article-button{background:0 0;border:2px solid #025e8d;border-radius:32px;box-shadow:none;color:#025e8d;font-size:1rem;font-weight:700;line-height:1.5;margin:0;padding:8px 24px;transition:all .2s ease 0s}.c-article-authors-search__item .c-article-button{width:100%}.c-pdf-download .u-button{background-color:#fff;border:2px solid #fff;color:#01324b;justify-content:center}.c-context-bar__container .c-pdf-download .u-button svg,.c-pdf-download .u-button svg{fill:currentcolor}.c-pdf-download .u-button:visited{color:#01324b}.c-pdf-download .u-button:hover{border:4px solid #01324b;box-shadow:none}.c-pdf-download .u-button:focus,.c-pdf-download .u-button:hover{background-color:#01324b}.c-pdf-download .u-button:focus svg path,.c-pdf-download .u-button:hover svg path{fill:#fff}.c-context-bar__container .c-pdf-download .u-button{background-image:none;border:2px solid;color:#fff}.c-context-bar__container .c-pdf-download .u-button:visited{color:#fff}.c-context-bar__container .c-pdf-download .u-button:hover{text-decoration:none}.c-context-bar__container .c-pdf-download .u-button:focus{box-shadow:none;outline:0;text-decoration:none}.c-context-bar__container .c-pdf-download .u-button:focus,.c-context-bar__container .c-pdf-download .u-button:hover{background-color:#fff;background-image:none;color:#01324b}.c-context-bar__container .c-pdf-download .u-button:focus svg path,.c-context-bar__container .c-pdf-download .u-button:hover svg path{fill:#01324b}.c-context-bar__container .c-pdf-download .u-button,.c-pdf-download .u-button{box-shadow:none;font-size:1rem;font-weight:700;line-height:1.5;padding:8px 24px}.c-context-bar__container .c-pdf-download .u-button{background-color:#025e8d}.c-pdf-download .u-button:hover{border:2px solid #fff}.c-pdf-download .u-button:focus,.c-pdf-download .u-button:hover{background:0 0;box-shadow:none;color:#fff}.c-context-bar__container .c-pdf-download .u-button:hover{border:2px solid #025e8d;box-shadow:none;color:#025e8d}.c-context-bar__container .c-pdf-download .u-button:focus,.c-pdf-download .u-button:focus{border:2px solid #025e8d}.c-article-share-box__button:focus:focus,.c-article__pill-button:focus:focus,.c-context-bar__container .c-pdf-download .u-button:focus:focus,.c-pdf-download .u-button:focus:focus{outline:3px solid #08c;will-change:transform}.c-pdf-download__link .u-icon{padding-top:0}.c-bibliographic-information__column button{margin-bottom:16px}.c-article-body .c-article-author-affiliation__list p,.c-article-body 
.c-article-author-information__list p,figure{margin:0}.c-article-share-box__button{margin-right:16px}.c-status-message--boxed{border-radius:12px}.c-article-associated-content__collection-title{font-size:1rem}.app-card-service__description,.c-article-body .app-card-service__description{color:#222;margin-bottom:0;margin-top:8px}.app-article-access__subscriptions a,.app-article-access__subscriptions a:visited,.app-book-series-listing__item a,.app-book-series-listing__item a:hover,.app-book-series-listing__item a:visited,.c-article-author-list a,.c-article-author-list a:visited,.c-article-buy-box a,.c-article-buy-box a:visited,.c-article-peer-review a,.c-article-peer-review a:visited,.c-article-satellite-subtitle a,.c-article-satellite-subtitle a:visited,.c-breadcrumbs__link,.c-breadcrumbs__link:hover,.c-breadcrumbs__link:visited{color:#000}.c-article-author-list svg{height:24px;margin:0 0 0 6px;width:24px}.c-article-header{margin-bottom:32px}@media only screen and (min-width:876px){.js .c-ad--conditional{display:block}}.u-lazy-ad-wrapper{background-color:#fff;display:none;min-height:149px}@media only screen and (min-width:876px){.u-lazy-ad-wrapper{display:block}}p.c-ad__label{margin-bottom:4px}.c-ad--728x90{background-color:#fff;border-bottom:2px solid #cedbe0} } </style> <style>@media only print, only all and (prefers-color-scheme: no-preference), only all and (prefers-color-scheme: light), only all and (prefers-color-scheme: dark) { .eds-c-header__brand img{height:24px;width:203px}.app-article-masthead__journal-link img{height:93px;width:72px}@media only screen and (min-width:769px){.app-article-masthead__journal-link img{height:161px;width:122px}} } </style> <link rel="stylesheet" data-test="critical-css-handler" data-inline-css-source="critical-css" href=/oscar-static/app-springerlink/css/core-darwin-3c86549cfc.css media="print" onload="this.media='all';this.onload=null"> <link rel="stylesheet" data-test="critical-css-handler" data-inline-css-source="critical-css" href="/oscar-static/app-springerlink/css/enhanced-darwin-article-72ba046d97.css" media="print" onload="this.media='only print, only all and (prefers-color-scheme: no-preference), only all and (prefers-color-scheme: light), only all and (prefers-color-scheme: dark)';this.onload=null"> <script type="text/javascript"> config = { env: 'live', site: '11263.springer.com', siteWithPath: '11263.springer.com' + window.location.pathname, twitterHashtag: '11263', cmsPrefix: 'https://studio-cms.springernature.com/studio/', publisherBrand: 'Springer', mustardcut: false }; </script> <script> window.dataLayer = [{"GA Key":"UA-26408784-1","DOI":"10.1007/s11263-015-0816-y","Page":"article","springerJournal":true,"Publishing Model":"Hybrid Access","page":{"attributes":{"environment":"live"}},"Country":"HK","japan":false,"doi":"10.1007-s11263-015-0816-y","Journal Id":11263,"Journal Title":"International Journal of Computer Vision","imprint":"Springer","Keywords":"Dataset, Large-scale, Benchmark, Object recognition, Object detection","kwrd":["Dataset","Large-scale","Benchmark","Object_recognition","Object_detection"],"Labs":"Y","ksg":"Krux.segments","kuid":"Krux.uid","Has Body":"Y","Features":[],"Open Access":"N","hasAccess":"N","bypassPaywall":"N","user":{"license":{"businessPartnerID":[],"businessPartnerIDString":""}},"Access Type":"no-access","Bpids":"","Bpnames":"","BPID":["1"],"VG Wort Identifier":"pw-vgzm.415900-10.1007-s11263-015-0816-y","Full HTML":"N","Subject 
Codes":["SCI","SCI22005","SCI21000","SCI22021","SCI2203X"],"pmc":["I","I22005","I21000","I22021","I2203X"],"session":{"authentication":{"loginStatus":"N"},"attributes":{"edition":"academic"}},"content":{"serial":{"eissn":"1573-1405","pissn":"0920-5691"},"type":"Article","category":{"pmc":{"primarySubject":"Computer Science","primarySubjectCode":"I","secondarySubjects":{"1":"Computer Imaging, Vision, Pattern Recognition and Graphics","2":"Artificial Intelligence","3":"Image Processing and Computer Vision","4":"Pattern Recognition"},"secondarySubjectCodes":{"1":"I22005","2":"I21000","3":"I22021","4":"I2203X"}},"sucode":"SC6","articleType":"Article"},"attributes":{"deliveryPlatform":"oscar"}},"Event Category":"Article"}]; </script> <script data-test="springer-link-article-datalayer"> window.dataLayer = window.dataLayer || []; window.dataLayer.push({ ga4MeasurementId: 'G-B3E4QL2TPR', ga360TrackingId: 'UA-26408784-1', twitterId: 'o47a7', baiduId: 'aef3043f025ccf2305af8a194652d70b', ga4ServerUrl: 'https://collect.springer.com', imprint: 'springerlink', page: { attributes:{ featureFlags: [{ name: 'darwin-orion', active: true }, { name: 'chapter-books-recs', active: true } ], darwinAvailable: true } } }); </script> <script> (function(w, d) { w.config = w.config || {}; w.config.mustardcut = false; if (w.matchMedia && w.matchMedia('only print, only all and (prefers-color-scheme: no-preference), only all and (prefers-color-scheme: light), only all and (prefers-color-scheme: dark)').matches) { w.config.mustardcut = true; d.classList.add('js'); d.classList.remove('grade-c'); d.classList.remove('no-js'); } })(window, document.documentElement); </script> <script class="js-entry"> if (window.config.mustardcut) { (function(w, d) { window.Component = {}; window.suppressShareButton = false; window.onArticlePage = true; var currentScript = d.currentScript || d.head.querySelector('script.js-entry'); function catchNoModuleSupport() { var scriptEl = d.createElement('script'); return (!('noModule' in scriptEl) && 'onbeforeload' in scriptEl) } var headScripts = [ {'src': '/oscar-static/js/polyfill-es5-bundle-572d4fec60.js', 'async': false} ]; var bodyScripts = [ {'src': '/oscar-static/js/global-article-es5-bundle-dad1690b0d.js', 'async': false, 'module': false}, {'src': '/oscar-static/js/global-article-es6-bundle-e7d03c4cb3.js', 'async': false, 'module': true} ]; function createScript(script) { var scriptEl = d.createElement('script'); scriptEl.src = script.src; scriptEl.async = script.async; if (script.module === true) { scriptEl.type = "module"; if (catchNoModuleSupport()) { scriptEl.src = ''; } } else if (script.module === false) { scriptEl.setAttribute('nomodule', true) } if (script.charset) { scriptEl.setAttribute('charset', script.charset); } return scriptEl; } for (var i = 0; i < headScripts.length; ++i) { var scriptEl = createScript(headScripts[i]); currentScript.parentNode.insertBefore(scriptEl, currentScript.nextSibling); } d.addEventListener('DOMContentLoaded', function() { for (var i = 0; i < bodyScripts.length; ++i) { var scriptEl = createScript(bodyScripts[i]); d.body.appendChild(scriptEl); } }); // Webfont repeat view var config = w.config; if (config && config.publisherBrand && sessionStorage.fontsLoaded === 'true') { d.documentElement.className += ' webfonts-loaded'; } })(window, document); } </script> <script data-src="https://cdn.optimizely.com/js/27195530232.js" data-cc-script="C03"></script> <script data-test="gtm-head"> window.initGTM = function() { if (window.config.mustardcut) { (function 
(w, d, s, l, i) { w[l] = w[l] || []; w[l].push({'gtm.start': new Date().getTime(), event: 'gtm.js'}); var f = d.getElementsByTagName(s)[0], j = d.createElement(s), dl = l != 'dataLayer' ? '&l=' + l : ''; j.async = true; j.src = 'https://www.googletagmanager.com/gtm.js?id=' + i + dl; f.parentNode.insertBefore(j, f); })(window, document, 'script', 'dataLayer', 'GTM-MRVXSHQ'); } } </script> <script> (function (w, d, t) { function cc() { var h = w.location.hostname; var e = d.createElement(t), s = d.getElementsByTagName(t)[0]; if (h.indexOf('springer.com') > -1 && h.indexOf('biomedcentral.com') === -1 && h.indexOf('springeropen.com') === -1) { if (h.indexOf('link-qa.springer.com') > -1 || h.indexOf('test-www.springer.com') > -1) { e.src = 'https://cmp.springer.com/production_live/en/consent-bundle-17-52.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } else { e.src = 'https://cmp.springer.com/production_live/en/consent-bundle-17-52.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } } else if (h.indexOf('biomedcentral.com') > -1) { if (h.indexOf('biomedcentral.com.qa') > -1) { e.src = 'https://cmp.biomedcentral.com/production_live/en/consent-bundle-15-36.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } else { e.src = 'https://cmp.biomedcentral.com/production_live/en/consent-bundle-15-36.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } } else if (h.indexOf('springeropen.com') > -1) { if (h.indexOf('springeropen.com.qa') > -1) { e.src = 'https://cmp.springernature.com/production_live/en/consent-bundle-16-34.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } else { e.src = 'https://cmp.springernature.com/production_live/en/consent-bundle-16-34.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-MRVXSHQ')"); } } else if (h.indexOf('springernature.com') > -1) { if (h.indexOf('beta-qa.springernature.com') > -1) { e.src = 'https://cmp.springernature.com/production_live/en/consent-bundle-49-43.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-NK22KLS')"); } else { e.src = 'https://cmp.springernature.com/production_live/en/consent-bundle-49-43.js'; e.setAttribute('onload', "initGTM(window,document,'script','dataLayer','GTM-NK22KLS')"); } } else { e.src = '/oscar-static/js/cookie-consent-es5-bundle-cb57c2c98a.js'; e.setAttribute('data-consent', h); } s.insertAdjacentElement('afterend', e); } cc(); })(window, document, 'script'); </script> <link rel="canonical" href="https://link.springer.com/article/10.1007/s11263-015-0816-y"/> <script type="application/ld+json">{"mainEntity":{"headline":"ImageNet Large Scale Visual Recognition Challenge","description":"The ImageNet Large Scale Visual Recognition Challenge is a benchmark in object category classification and detection on hundreds of object categories and millions of images. The challenge has been run annually from 2010 to present, attracting participation from more than fifty institutions. This paper describes the creation of this benchmark dataset and the advances in object recognition that have been possible as a result. 
We discuss the challenges of collecting large-scale ground truth annotation, highlight key breakthroughs in categorical object recognition, provide a detailed analysis of the current state of the field of large-scale image classification and object detection, and compare the state-of-the-art computer vision accuracy with human accuracy. We conclude with lessons learned in the 5 years of the challenge, and propose future directions and improvements.","datePublished":"2015-04-11T00:00:00Z","dateModified":"2015-04-11T00:00:00Z","pageStart":"211","pageEnd":"252","sameAs":"https://doi.org/10.1007/s11263-015-0816-y","keywords":["Dataset","Large-scale","Benchmark","Object recognition","Object detection","Computer Imaging","Vision","Pattern Recognition and Graphics","Artificial Intelligence","Image Processing and Computer Vision","Pattern Recognition"],"image":["https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig1_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig2_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig3_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig4_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig5_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig6_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Figa_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig7_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig8_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Figb_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig9_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig10_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig11_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig12_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig13_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig14_HTML.gif","https://media.springernature.com/lw1200/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig15_HTML.gif"],"isPartOf":{"name":"International Journal of Computer Vision","issn":["1573-1405","0920-5691"],"volumeNumber":"115","@type":["Periodical","PublicationVolume"]},"publisher":{"name":"Springer 
US","logo":{"url":"https://www.springernature.com/app-sn/public/images/logo-springernature.png","@type":"ImageObject"},"@type":"Organization"},"author":[{"name":"Olga Russakovsky","affiliation":[{"name":"Stanford University","address":{"name":"Stanford University, Stanford, USA","@type":"PostalAddress"},"@type":"Organization"}],"email":"olga@cs.stanford.edu","@type":"Person"},{"name":"Jia Deng","affiliation":[{"name":"University of Michigan","address":{"name":"University of Michigan, Ann Arbor, USA","@type":"PostalAddress"},"@type":"Organization"}],"@type":"Person"},{"name":"Hao Su","affiliation":[{"name":"Stanford University","address":{"name":"Stanford University, Stanford, USA","@type":"PostalAddress"},"@type":"Organization"}],"@type":"Person"},{"name":"Jonathan Krause","affiliation":[{"name":"Stanford University","address":{"name":"Stanford University, Stanford, USA","@type":"PostalAddress"},"@type":"Organization"}],"@type":"Person"},{"name":"Sanjeev Satheesh","affiliation":[{"name":"Stanford University","address":{"name":"Stanford University, Stanford, USA","@type":"PostalAddress"},"@type":"Organization"}],"@type":"Person"},{"name":"Sean Ma","affiliation":[{"name":"Stanford University","address":{"name":"Stanford University, Stanford, USA","@type":"PostalAddress"},"@type":"Organization"}],"@type":"Person"},{"name":"Zhiheng Huang","affiliation":[{"name":"Stanford University","address":{"name":"Stanford University, Stanford, USA","@type":"PostalAddress"},"@type":"Organization"}],"@type":"Person"},{"name":"Andrej Karpathy","affiliation":[{"name":"Stanford University","address":{"name":"Stanford University, Stanford, USA","@type":"PostalAddress"},"@type":"Organization"}],"@type":"Person"},{"name":"Aditya Khosla","affiliation":[{"name":"Massachusetts Institute of Technology","address":{"name":"Massachusetts Institute of Technology, Cambridge, USA","@type":"PostalAddress"},"@type":"Organization"}],"@type":"Person"},{"name":"Michael Bernstein","affiliation":[{"name":"Stanford University","address":{"name":"Stanford University, Stanford, USA","@type":"PostalAddress"},"@type":"Organization"}],"@type":"Person"},{"name":"Alexander C. 
Berg","affiliation":[{"name":"UNC Chapel Hill","address":{"name":"UNC Chapel Hill, Chapel Hill, USA","@type":"PostalAddress"},"@type":"Organization"}],"@type":"Person"},{"name":"Li Fei-Fei","affiliation":[{"name":"Stanford University","address":{"name":"Stanford University, Stanford, USA","@type":"PostalAddress"},"@type":"Organization"}],"@type":"Person"}],"isAccessibleForFree":false,"hasPart":{"isAccessibleForFree":false,"cssSelector":".main-content","@type":"WebPageElement"},"@type":"ScholarlyArticle"},"@context":"https://schema.org","@type":"WebPage"}</script> </head> <body class="" > <!-- Google Tag Manager (noscript) --> <noscript> <iframe src="https://www.googletagmanager.com/ns.html?id=GTM-MRVXSHQ" height="0" width="0" style="display:none;visibility:hidden"></iframe> </noscript> <!-- End Google Tag Manager (noscript) --> <!-- Google Tag Manager (noscript) --> <noscript data-test="gtm-body"> <iframe src="https://www.googletagmanager.com/ns.html?id=GTM-MRVXSHQ" height="0" width="0" style="display:none;visibility:hidden"></iframe> </noscript> <!-- End Google Tag Manager (noscript) --> <div class="u-visually-hidden" aria-hidden="true" data-test="darwin-icons"> <?xml version="1.0" encoding="UTF-8"?><!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN" "http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd"><svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"><symbol id="icon-eds-i-accesses-medium" viewBox="0 0 24 24"><path d="M15.59 1a1 1 0 0 1 .706.291l5.41 5.385a1 1 0 0 1 .294.709v13.077c0 .674-.269 1.32-.747 1.796a2.549 2.549 0 0 1-1.798.742H15a1 1 0 0 1 0-2h4.455a.549.549 0 0 0 .387-.16.535.535 0 0 0 .158-.378V7.8L15.178 3H5.545a.543.543 0 0 0-.538.451L5 3.538v8.607a1 1 0 0 1-2 0V3.538A2.542 2.542 0 0 1 5.545 1h10.046ZM8 13c2.052 0 4.66 1.61 6.36 3.4l.124.141c.333.41.516.925.516 1.459 0 .6-.232 1.178-.64 1.599C12.666 21.388 10.054 23 8 23c-2.052 0-4.66-1.61-6.353-3.393A2.31 2.31 0 0 1 1 18c0-.6.232-1.178.64-1.6C3.34 14.61 5.948 13 8 13Zm0 2c-1.369 0-3.552 1.348-4.917 2.785A.31.31 0 0 0 3 18c0 .083.031.161.09.222C4.447 19.652 6.631 21 8 21c1.37 0 3.556-1.35 4.917-2.785A.31.31 0 0 0 13 18a.32.32 0 0 0-.048-.17l-.042-.052C11.553 16.348 9.369 15 8 15Zm0 1a2 2 0 1 1 0 4 2 2 0 0 1 0-4Z"/></symbol><symbol id="icon-eds-i-altmetric-medium" viewBox="0 0 24 24"><path d="M12 1c5.978 0 10.843 4.77 10.996 10.712l.004.306-.002.022-.002.248C22.843 18.23 17.978 23 12 23 5.925 23 1 18.075 1 12S5.925 1 12 1Zm-1.726 9.246L8.848 12.53a1 1 0 0 1-.718.461L8.003 13l-4.947.014a9.001 9.001 0 0 0 17.887-.001L16.553 13l-2.205 3.53a1 1 0 0 1-1.735-.068l-.05-.11-2.289-6.106ZM12 3a9.001 9.001 0 0 0-8.947 8.013l4.391-.012L9.652 7.47a1 1 0 0 1 1.784.179l2.288 6.104 1.428-2.283a1 1 0 0 1 .722-.462l.129-.008 4.943.012A9.001 9.001 0 0 0 12 3Z"/></symbol><symbol id="icon-eds-i-arrow-bend-down-medium" viewBox="0 0 24 24"><path d="m11.852 20.989.058.007L12 21l.075-.003.126-.017.111-.03.111-.044.098-.052.104-.074.082-.073 6-6a1 1 0 0 0-1.414-1.414L13 17.585v-12.2C13 4.075 11.964 3 10.667 3H4a1 1 0 1 0 0 2h6.667c.175 0 .333.164.333.385v12.2l-4.293-4.292a1 1 0 0 0-1.32-.083l-.094.083a1 1 0 0 0 0 1.414l6 6c.035.036.073.068.112.097l.11.071.114.054.105.035.118.025Z"/></symbol><symbol id="icon-eds-i-arrow-bend-down-small" viewBox="0 0 16 16"><path d="M1 2a1 1 0 0 0 1 1h5v8.585L3.707 8.293a1 1 0 0 0-1.32-.083l-.094.083a1 1 0 0 0 0 1.414l5 5 .063.059.093.069.081.048.105.048.104.035.105.022.096.01h.136l.122-.018.113-.03.103-.04.1-.053.102-.07.052-.043 5.04-5.037a1 1 0 1 0-1.415-1.414L9 11.583V3a2 2 0 0 0-2-2H2a1 1 
0 0 0-1 1Z"/></symbol><symbol id="icon-eds-i-arrow-bend-up-medium" viewBox="0 0 24 24"><path d="m11.852 3.011.058-.007L12 3l.075.003.126.017.111.03.111.044.098.052.104.074.082.073 6 6a1 1 0 1 1-1.414 1.414L13 6.415v12.2C13 19.925 11.964 21 10.667 21H4a1 1 0 0 1 0-2h6.667c.175 0 .333-.164.333-.385v-12.2l-4.293 4.292a1 1 0 0 1-1.32.083l-.094-.083a1 1 0 0 1 0-1.414l6-6c.035-.036.073-.068.112-.097l.11-.071.114-.054.105-.035.118-.025Z"/></symbol><symbol id="icon-eds-i-arrow-bend-up-small" viewBox="0 0 16 16"><path d="M1 13.998a1 1 0 0 1 1-1h5V4.413L3.707 7.705a1 1 0 0 1-1.32.084l-.094-.084a1 1 0 0 1 0-1.414l5-5 .063-.059.093-.068.081-.05.105-.047.104-.035.105-.022L7.94 1l.136.001.122.017.113.03.103.04.1.053.102.07.052.043 5.04 5.037a1 1 0 1 1-1.415 1.414L9 4.415v8.583a2 2 0 0 1-2 2H2a1 1 0 0 1-1-1Z"/></symbol><symbol id="icon-eds-i-arrow-diagonal-medium" viewBox="0 0 24 24"><path d="M14 3h6l.075.003.126.017.111.03.111.044.098.052.096.067.09.08c.036.035.068.073.097.112l.071.11.054.114.035.105.03.148L21 4v6a1 1 0 0 1-2 0V6.414l-4.293 4.293a1 1 0 0 1-1.414-1.414L17.584 5H14a1 1 0 0 1-.993-.883L13 4a1 1 0 0 1 1-1ZM4 13a1 1 0 0 1 1 1v3.584l4.293-4.291a1 1 0 1 1 1.414 1.414L6.414 19H10a1 1 0 0 1 .993.883L11 20a1 1 0 0 1-1 1l-6.075-.003-.126-.017-.111-.03-.111-.044-.098-.052-.096-.067-.09-.08a1.01 1.01 0 0 1-.097-.112l-.071-.11-.054-.114-.035-.105-.025-.118-.007-.058L3 20v-6a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-arrow-diagonal-small" viewBox="0 0 16 16"><path d="m2 15-.082-.004-.119-.016-.111-.03-.111-.044-.098-.052-.096-.067-.09-.08a1.008 1.008 0 0 1-.097-.112l-.071-.11-.031-.062-.034-.081-.024-.076-.025-.118-.007-.058L1 14.02V9a1 1 0 1 1 2 0v2.584l2.793-2.791a1 1 0 1 1 1.414 1.414L4.414 13H7a1 1 0 0 1 .993.883L8 14a1 1 0 0 1-1 1H2ZM14 1l.081.003.12.017.111.03.111.044.098.052.096.067.09.08c.036.035.068.073.097.112l.071.11.031.062.034.081.024.076.03.148L15 2v5a1 1 0 0 1-2 0V4.414l-2.96 2.96A1 1 0 1 1 8.626 5.96L11.584 3H9a1 1 0 0 1-.993-.883L8 2a1 1 0 0 1 1-1h5Z"/></symbol><symbol id="icon-eds-i-arrow-down-medium" viewBox="0 0 24 24"><path d="m20.707 12.728-7.99 7.98a.996.996 0 0 1-.561.281l-.157.011a.998.998 0 0 1-.788-.384l-7.918-7.908a1 1 0 0 1 1.414-1.416L11 17.576V4a1 1 0 0 1 2 0v13.598l6.293-6.285a1 1 0 0 1 1.32-.082l.095.083a1 1 0 0 1-.001 1.414Z"/></symbol><symbol id="icon-eds-i-arrow-down-small" viewBox="0 0 16 16"><path d="m1.293 8.707 6 6 .063.059.093.069.081.048.105.049.104.034.056.013.118.017L8 15l.076-.003.122-.017.113-.03.085-.032.063-.03.098-.058.06-.043.05-.043 6.04-6.037a1 1 0 0 0-1.414-1.414L9 11.583V2a1 1 0 1 0-2 0v9.585L2.707 7.293a1 1 0 0 0-1.32-.083l-.094.083a1 1 0 0 0 0 1.414Z"/></symbol><symbol id="icon-eds-i-arrow-left-medium" viewBox="0 0 24 24"><path d="m11.272 3.293-7.98 7.99a.996.996 0 0 0-.281.561L3 12.001c0 .32.15.605.384.788l7.908 7.918a1 1 0 0 0 1.416-1.414L6.424 13H20a1 1 0 0 0 0-2H6.402l6.285-6.293a1 1 0 0 0 .082-1.32l-.083-.095a1 1 0 0 0-1.414.001Z"/></symbol><symbol id="icon-eds-i-arrow-left-small" viewBox="0 0 16 16"><path d="m7.293 1.293-6 6-.059.063-.069.093-.048.081-.049.105-.034.104-.013.056-.017.118L1 8l.003.076.017.122.03.113.032.085.03.063.058.098.043.06.043.05 6.037 6.04a1 1 0 0 0 1.414-1.414L4.417 9H14a1 1 0 0 0 0-2H4.415l4.292-4.293a1 1 0 0 0 .083-1.32l-.083-.094a1 1 0 0 0-1.414 0Z"/></symbol><symbol id="icon-eds-i-arrow-right-medium" viewBox="0 0 24 24"><path d="m12.728 3.293 7.98 7.99a.996.996 0 0 1 .281.561l.011.157c0 .32-.15.605-.384.788l-7.908 7.918a1 1 0 0 1-1.416-1.414L17.576 13H4a1 1 0 0 1 0-2h13.598l-6.285-6.293a1 1 0 0 
1-.082-1.32l.083-.095a1 1 0 0 1 1.414.001Z"/></symbol><symbol id="icon-eds-i-arrow-right-small" viewBox="0 0 16 16"><path d="m8.707 1.293 6 6 .059.063.069.093.048.081.049.105.034.104.013.056.017.118L15 8l-.003.076-.017.122-.03.113-.032.085-.03.063-.058.098-.043.06-.043.05-6.037 6.04a1 1 0 0 1-1.414-1.414L11.583 9H2a1 1 0 1 1 0-2h9.585L7.293 2.707a1 1 0 0 1-.083-1.32l.083-.094a1 1 0 0 1 1.414 0Z"/></symbol><symbol id="icon-eds-i-arrow-up-medium" viewBox="0 0 24 24"><path d="m3.293 11.272 7.99-7.98a.996.996 0 0 1 .561-.281L12.001 3c.32 0 .605.15.788.384l7.918 7.908a1 1 0 0 1-1.414 1.416L13 6.424V20a1 1 0 0 1-2 0V6.402l-6.293 6.285a1 1 0 0 1-1.32.082l-.095-.083a1 1 0 0 1 .001-1.414Z"/></symbol><symbol id="icon-eds-i-arrow-up-small" viewBox="0 0 16 16"><path d="m1.293 7.293 6-6 .063-.059.093-.069.081-.048.105-.049.104-.034.056-.013.118-.017L8 1l.076.003.122.017.113.03.085.032.063.03.098.058.06.043.05.043 6.04 6.037a1 1 0 0 1-1.414 1.414L9 4.417V14a1 1 0 0 1-2 0V4.415L2.707 8.707a1 1 0 0 1-1.32.083l-.094-.083a1 1 0 0 1 0-1.414Z"/></symbol><symbol id="icon-eds-i-article-medium" viewBox="0 0 24 24"><path d="M8 7a1 1 0 0 0 0 2h4a1 1 0 1 0 0-2H8ZM8 11a1 1 0 1 0 0 2h8a1 1 0 1 0 0-2H8ZM7 16a1 1 0 0 1 1-1h8a1 1 0 1 1 0 2H8a1 1 0 0 1-1-1Z"/><path d="M5.545 1A2.542 2.542 0 0 0 3 3.538v16.924A2.542 2.542 0 0 0 5.545 23h12.91A2.542 2.542 0 0 0 21 20.462V3.5A2.5 2.5 0 0 0 18.5 1H5.545ZM5 3.538C5 3.245 5.24 3 5.545 3H18.5a.5.5 0 0 1 .5.5v16.962c0 .293-.24.538-.546.538H5.545A.542.542 0 0 1 5 20.462V3.538Z" clip-rule="evenodd"/></symbol><symbol id="icon-eds-i-book-medium" viewBox="0 0 24 24"><path d="M18.5 1A2.5 2.5 0 0 1 21 3.5v12c0 1.16-.79 2.135-1.86 2.418l-.14.031V21h1a1 1 0 0 1 .993.883L21 22a1 1 0 0 1-1 1H6.5A3.5 3.5 0 0 1 3 19.5v-15A3.5 3.5 0 0 1 6.5 1h12ZM17 18H6.5a1.5 1.5 0 0 0-1.493 1.356L5 19.5A1.5 1.5 0 0 0 6.5 21H17v-3Zm1.5-15h-12A1.5 1.5 0 0 0 5 4.5v11.837l.054-.025a3.481 3.481 0 0 1 1.254-.307L6.5 16h12a.5.5 0 0 0 .492-.41L19 15.5v-12a.5.5 0 0 0-.5-.5ZM15 6a1 1 0 0 1 0 2H9a1 1 0 1 1 0-2h6Z"/></symbol><symbol id="icon-eds-i-book-series-medium" viewBox="0 0 24 24"><path fill-rule="evenodd" d="M1 3.786C1 2.759 1.857 2 2.82 2H6.18c.964 0 1.82.759 1.82 1.786V4h3.168c.668 0 1.298.364 1.616.938.158-.109.333-.195.523-.252l3.216-.965c.923-.277 1.962.204 2.257 1.187l4.146 13.82c.296.984-.307 1.957-1.23 2.234l-3.217.965c-.923.277-1.962-.203-2.257-1.187L13 10.005v10.21c0 1.04-.878 1.785-1.834 1.785H7.833c-.291 0-.575-.07-.83-.195A1.849 1.849 0 0 1 6.18 22H2.821C1.857 22 1 21.241 1 20.214V3.786ZM3 4v11h3V4H3Zm0 16v-3h3v3H3Zm15.075-.04-.814-2.712 2.874-.862.813 2.712-2.873.862Zm1.485-5.49-2.874.862-2.634-8.782 2.873-.862 2.635 8.782ZM8 20V6h3v14H8Z" clip-rule="evenodd"/></symbol><symbol id="icon-eds-i-calendar-acceptance-medium" viewBox="0 0 24 24"><path d="M17 2a1 1 0 0 1 1 1v1h1.5C20.817 4 22 5.183 22 6.5v13c0 1.317-1.183 2.5-2.5 2.5h-15C3.183 22 2 20.817 2 19.5v-13C2 5.183 3.183 4 4.5 4a1 1 0 1 1 0 2c-.212 0-.5.288-.5.5v13c0 .212.288.5.5.5h15c.212 0 .5-.288.5-.5v-13c0-.212-.288-.5-.5-.5H18v1a1 1 0 0 1-2 0V3a1 1 0 0 1 1-1Zm-.534 7.747a1 1 0 0 1 .094 1.412l-4.846 5.538a1 1 0 0 1-1.352.141l-2.77-2.076a1 1 0 0 1 1.2-1.6l2.027 1.519 4.236-4.84a1 1 0 0 1 1.411-.094ZM7.5 2a1 1 0 0 1 1 1v1H14a1 1 0 0 1 0 2H8.5v1a1 1 0 1 1-2 0V3a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-calendar-date-medium" viewBox="0 0 24 24"><path d="M17 2a1 1 0 0 1 1 1v1h1.5C20.817 4 22 5.183 22 6.5v13c0 1.317-1.183 2.5-2.5 2.5h-15C3.183 22 2 20.817 2 19.5v-13C2 5.183 3.183 4 4.5 4a1 1 0 1 1 0 2c-.212 0-.5.288-.5.5v13c0 
.212.288.5.5.5h15c.212 0 .5-.288.5-.5v-13c0-.212-.288-.5-.5-.5H18v1a1 1 0 0 1-2 0V3a1 1 0 0 1 1-1ZM8 15a1 1 0 1 1 0 2 1 1 0 0 1 0-2Zm4 0a1 1 0 1 1 0 2 1 1 0 0 1 0-2Zm-4-4a1 1 0 1 1 0 2 1 1 0 0 1 0-2Zm4 0a1 1 0 1 1 0 2 1 1 0 0 1 0-2Zm4 0a1 1 0 1 1 0 2 1 1 0 0 1 0-2ZM7.5 2a1 1 0 0 1 1 1v1H14a1 1 0 0 1 0 2H8.5v1a1 1 0 1 1-2 0V3a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-calendar-decision-medium" viewBox="0 0 24 24"><path d="M17 2a1 1 0 0 1 1 1v1h1.5C20.817 4 22 5.183 22 6.5v13c0 1.317-1.183 2.5-2.5 2.5h-15C3.183 22 2 20.817 2 19.5v-13C2 5.183 3.183 4 4.5 4a1 1 0 1 1 0 2c-.212 0-.5.288-.5.5v13c0 .212.288.5.5.5h15c.212 0 .5-.288.5-.5v-13c0-.212-.288-.5-.5-.5H18v1a1 1 0 0 1-2 0V3a1 1 0 0 1 1-1Zm-2.935 8.246 2.686 2.645c.34.335.34.883 0 1.218l-2.686 2.645a.858.858 0 0 1-1.213-.009.854.854 0 0 1 .009-1.21l1.05-1.035H7.984a.992.992 0 0 1-.984-1c0-.552.44-1 .984-1h5.928l-1.051-1.036a.854.854 0 0 1-.085-1.121l.076-.088a.858.858 0 0 1 1.213-.009ZM7.5 2a1 1 0 0 1 1 1v1H14a1 1 0 0 1 0 2H8.5v1a1 1 0 1 1-2 0V3a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-calendar-impact-factor-medium" viewBox="0 0 24 24"><path d="M17 2a1 1 0 0 1 1 1v1h1.5C20.817 4 22 5.183 22 6.5v13c0 1.317-1.183 2.5-2.5 2.5h-15C3.183 22 2 20.817 2 19.5v-13C2 5.183 3.183 4 4.5 4a1 1 0 1 1 0 2c-.212 0-.5.288-.5.5v13c0 .212.288.5.5.5h15c.212 0 .5-.288.5-.5v-13c0-.212-.288-.5-.5-.5H18v1a1 1 0 0 1-2 0V3a1 1 0 0 1 1-1Zm-3.2 6.924a.48.48 0 0 1 .125.544l-1.52 3.283h2.304c.27 0 .491.215.491.483a.477.477 0 0 1-.13.327l-4.18 4.484a.498.498 0 0 1-.69.031.48.48 0 0 1-.125-.544l1.52-3.284H9.291a.487.487 0 0 1-.491-.482c0-.121.047-.238.13-.327l4.18-4.484a.498.498 0 0 1 .69-.031ZM7.5 2a1 1 0 0 1 1 1v1H14a1 1 0 0 1 0 2H8.5v1a1 1 0 1 1-2 0V3a1 1 0 0 1 1-1Z"/></symbol><symbol id="icon-eds-i-call-papers-medium" viewBox="0 0 24 24"><g><path d="m20.707 2.883-1.414 1.414a1 1 0 0 0 1.414 1.414l1.414-1.414a1 1 0 0 0-1.414-1.414Z"/><path d="M6 16.054c0 2.026 1.052 2.943 3 2.943a1 1 0 1 1 0 2c-2.996 0-5-1.746-5-4.943v-1.227a4.068 4.068 0 0 1-1.83-1.189 4.553 4.553 0 0 1-.87-1.455 4.868 4.868 0 0 1-.3-1.686c0-1.17.417-2.298 1.17-3.14.38-.426.834-.767 1.338-1 .51-.237 1.06-.36 1.617-.36L6.632 6H7l7.932-2.895A2.363 2.363 0 0 1 18 5.36v9.28a2.36 2.36 0 0 1-3.069 2.25l.084.03L7 14.997H6v1.057Zm9.637-11.057a.415.415 0 0 0-.083.008L8 7.638v5.536l7.424 1.786.104.02c.035.01.072.02.109.02.2 0 .363-.16.363-.36V5.36c0-.2-.163-.363-.363-.363Zm-9.638 3h-.874a1.82 1.82 0 0 0-.625.111l-.15.063a2.128 2.128 0 0 0-.689.517c-.42.47-.661 1.123-.661 1.81 0 .34.06.678.176.992.114.308.28.585.485.816.4.447.925.691 1.464.691h.874v-5Z" clip-rule="evenodd"/><path d="M20 8.997h2a1 1 0 1 1 0 2h-2a1 1 0 1 1 0-2ZM20.707 14.293l1.414 1.414a1 1 0 0 1-1.414 1.414l-1.414-1.414a1 1 0 0 1 1.414-1.414Z"/></g></symbol><symbol id="icon-eds-i-card-medium" viewBox="0 0 24 24"><path d="M19.615 2c.315 0 .716.067 1.14.279.76.38 1.245 1.107 1.245 2.106v15.23c0 .315-.067.716-.279 1.14-.38.76-1.107 1.245-2.106 1.245H4.385a2.56 2.56 0 0 1-1.14-.279C2.485 21.341 2 20.614 2 19.615V4.385c0-.315.067-.716.279-1.14C2.659 2.485 3.386 2 4.385 2h15.23Zm0 2H4.385c-.213 0-.265.034-.317.14A.71.71 0 0 0 4 4.385v15.23c0 .213.034.265.14.317a.71.71 0 0 0 .245.068h15.23c.213 0 .265-.034.317-.14a.71.71 0 0 0 .068-.245V4.385c0-.213-.034-.265-.14-.317A.71.71 0 0 0 19.615 4ZM17 16a1 1 0 0 1 0 2H7a1 1 0 0 1 0-2h10Zm0-3a1 1 0 0 1 0 2H7a1 1 0 0 1 0-2h10Zm-.5-7A1.5 1.5 0 0 1 18 7.5v3a1.5 1.5 0 0 1-1.5 1.5h-9A1.5 1.5 0 0 1 6 10.5v-3A1.5 1.5 0 0 1 7.5 6h9ZM16 8H8v2h8V8Z"/></symbol><symbol id="icon-eds-i-cart-medium" viewBox="0 0 
24 24"><path d="M5.76 1a1 1 0 0 1 .994.902L7.155 6h13.34c.18 0 .358.02.532.057l.174.045a2.5 2.5 0 0 1 1.693 3.103l-2.069 7.03c-.36 1.099-1.398 1.823-2.49 1.763H8.65c-1.272.015-2.352-.927-2.546-2.244L4.852 3H2a1 1 0 0 1-.993-.883L1 2a1 1 0 0 1 1-1h3.76Zm2.328 14.51a.555.555 0 0 0 .55.488l9.751.001a.533.533 0 0 0 .527-.357l2.059-7a.5.5 0 0 0-.48-.642H7.351l.737 7.51ZM18 19a2 2 0 1 1 0 4 2 2 0 0 1 0-4ZM8 19a2 2 0 1 1 0 4 2 2 0 0 1 0-4Z"/></symbol><symbol id="icon-eds-i-check-circle-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm0 2a9 9 0 1 0 0 18 9 9 0 0 0 0-18Zm5.125 4.72a1 1 0 0 1 .156 1.405l-6 7.5a1 1 0 0 1-1.421.143l-3-2.5a1 1 0 0 1 1.28-1.536l2.217 1.846 5.362-6.703a1 1 0 0 1 1.406-.156Z"/></symbol><symbol id="icon-eds-i-check-filled-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm5.125 6.72a1 1 0 0 0-1.406.155l-5.362 6.703-2.217-1.846a1 1 0 1 0-1.28 1.536l3 2.5a1 1 0 0 0 1.42-.143l6-7.5a1 1 0 0 0-.155-1.406Z"/></symbol><symbol id="icon-eds-i-chevron-down-medium" viewBox="0 0 24 24"><path d="M3.305 8.28a1 1 0 0 0-.024 1.415l7.495 7.762c.314.345.757.543 1.224.543.467 0 .91-.198 1.204-.522l7.515-7.783a1 1 0 1 0-1.438-1.39L12 15.845l-7.28-7.54A1 1 0 0 0 3.4 8.2l-.096.082Z"/></symbol><symbol id="icon-eds-i-chevron-down-small" viewBox="0 0 16 16"><path d="M13.692 5.278a1 1 0 0 1 .03 1.414L9.103 11.51a1.491 1.491 0 0 1-2.188.019L2.278 6.692a1 1 0 0 1 1.444-1.384L8 9.771l4.278-4.463a1 1 0 0 1 1.318-.111l.096.081Z"/></symbol><symbol id="icon-eds-i-chevron-left-medium" viewBox="0 0 24 24"><path d="M15.72 3.305a1 1 0 0 0-1.415-.024l-7.762 7.495A1.655 1.655 0 0 0 6 12c0 .467.198.91.522 1.204l7.783 7.515a1 1 0 1 0 1.39-1.438L8.155 12l7.54-7.28A1 1 0 0 0 15.8 3.4l-.082-.096Z"/></symbol><symbol id="icon-eds-i-chevron-left-small" viewBox="0 0 16 16"><path d="M10.722 2.308a1 1 0 0 0-1.414-.03L4.49 6.897a1.491 1.491 0 0 0-.019 2.188l4.838 4.637a1 1 0 1 0 1.384-1.444L6.229 8l4.463-4.278a1 1 0 0 0 .111-1.318l-.081-.096Z"/></symbol><symbol id="icon-eds-i-chevron-right-medium" viewBox="0 0 24 24"><path d="M8.28 3.305a1 1 0 0 1 1.415-.024l7.762 7.495c.345.314.543.757.543 1.224 0 .467-.198.91-.522 1.204l-7.783 7.515a1 1 0 1 1-1.39-1.438L15.845 12l-7.54-7.28A1 1 0 0 1 8.2 3.4l.082-.096Z"/></symbol><symbol id="icon-eds-i-chevron-right-small" viewBox="0 0 16 16"><path d="M5.278 2.308a1 1 0 0 1 1.414-.03l4.819 4.619a1.491 1.491 0 0 1 .019 2.188l-4.838 4.637a1 1 0 1 1-1.384-1.444L9.771 8 5.308 3.722a1 1 0 0 1-.111-1.318l.081-.096Z"/></symbol><symbol id="icon-eds-i-chevron-up-medium" viewBox="0 0 24 24"><path d="M20.695 15.72a1 1 0 0 0 .024-1.415l-7.495-7.762A1.655 1.655 0 0 0 12 6c-.467 0-.91.198-1.204.522l-7.515 7.783a1 1 0 1 0 1.438 1.39L12 8.155l7.28 7.54a1 1 0 0 0 1.319.106l.096-.082Z"/></symbol><symbol id="icon-eds-i-chevron-up-small" viewBox="0 0 16 16"><path d="M13.692 10.722a1 1 0 0 0 .03-1.414L9.103 4.49a1.491 1.491 0 0 0-2.188-.019L2.278 9.308a1 1 0 0 0 1.444 1.384L8 6.229l4.278 4.463a1 1 0 0 0 1.318.111l.096-.081Z"/></symbol><symbol id="icon-eds-i-citations-medium" viewBox="0 0 24 24"><path d="M15.59 1a1 1 0 0 1 .706.291l5.41 5.385a1 1 0 0 1 .294.709v13.077c0 .674-.269 1.32-.747 1.796a2.549 2.549 0 0 1-1.798.742h-5.843a1 1 0 1 1 0-2h5.843a.549.549 0 0 0 .387-.16.535.535 0 0 0 .158-.378V7.8L15.178 3H5.545a.543.543 0 0 0-.538.451L5 3.538v8.607a1 1 0 0 1-2 0V3.538A2.542 2.542 0 0 1 5.545 1h10.046ZM5.483 
14.35c.197.26.17.62-.049.848l-.095.083-.016.011c-.36.24-.628.45-.804.634-.393.409-.59.93-.59 1.562.077-.019.192-.028.345-.028.442 0 .84.158 1.195.474.355.316.532.716.532 1.2 0 .501-.173.9-.518 1.198-.345.298-.767.446-1.266.446-.672 0-1.209-.195-1.612-.585-.403-.39-.604-.976-.604-1.757 0-.744.11-1.39.33-1.938.222-.549.49-1.009.807-1.38a4.28 4.28 0 0 1 .992-.88c.07-.043.148-.087.232-.133a.881.881 0 0 1 1.121.245Zm5 0c.197.26.17.62-.049.848l-.095.083-.016.011c-.36.24-.628.45-.804.634-.393.409-.59.93-.59 1.562.077-.019.192-.028.345-.028.442 0 .84.158 1.195.474.355.316.532.716.532 1.2 0 .501-.173.9-.518 1.198-.345.298-.767.446-1.266.446-.672 0-1.209-.195-1.612-.585-.403-.39-.604-.976-.604-1.757 0-.744.11-1.39.33-1.938.222-.549.49-1.009.807-1.38a4.28 4.28 0 0 1 .992-.88c.07-.043.148-.087.232-.133a.881.881 0 0 1 1.121.245Z"/></symbol><symbol id="icon-eds-i-clipboard-check-medium" viewBox="0 0 24 24"><path d="M14.4 1c1.238 0 2.274.865 2.536 2.024L18.5 3C19.886 3 21 4.14 21 5.535v14.93C21 21.86 19.886 23 18.5 23h-13C4.114 23 3 21.86 3 20.465V5.535C3 4.14 4.114 3 5.5 3h1.57c.27-1.147 1.3-2 2.53-2h4.8Zm4.115 4-1.59.024A2.601 2.601 0 0 1 14.4 7H9.6c-1.23 0-2.26-.853-2.53-2H5.5c-.27 0-.5.234-.5.535v14.93c0 .3.23.535.5.535h13c.27 0 .5-.234.5-.535V5.535c0-.3-.23-.535-.485-.535Zm-1.909 4.205a1 1 0 0 1 .19 1.401l-5.334 7a1 1 0 0 1-1.344.23l-2.667-1.75a1 1 0 1 1 1.098-1.672l1.887 1.238 4.769-6.258a1 1 0 0 1 1.401-.19ZM14.4 3H9.6a.6.6 0 0 0-.6.6v.8a.6.6 0 0 0 .6.6h4.8a.6.6 0 0 0 .6-.6v-.8a.6.6 0 0 0-.6-.6Z"/></symbol><symbol id="icon-eds-i-clipboard-report-medium" viewBox="0 0 24 24"><path d="M14.4 1c1.238 0 2.274.865 2.536 2.024L18.5 3C19.886 3 21 4.14 21 5.535v14.93C21 21.86 19.886 23 18.5 23h-13C4.114 23 3 21.86 3 20.465V5.535C3 4.14 4.114 3 5.5 3h1.57c.27-1.147 1.3-2 2.53-2h4.8Zm4.115 4-1.59.024A2.601 2.601 0 0 1 14.4 7H9.6c-1.23 0-2.26-.853-2.53-2H5.5c-.27 0-.5.234-.5.535v14.93c0 .3.23.535.5.535h13c.27 0 .5-.234.5-.535V5.535c0-.3-.23-.535-.485-.535Zm-2.658 10.929a1 1 0 0 1 0 2H8a1 1 0 0 1 0-2h7.857Zm0-3.929a1 1 0 0 1 0 2H8a1 1 0 0 1 0-2h7.857ZM14.4 3H9.6a.6.6 0 0 0-.6.6v.8a.6.6 0 0 0 .6.6h4.8a.6.6 0 0 0 .6-.6v-.8a.6.6 0 0 0-.6-.6Z"/></symbol><symbol id="icon-eds-i-close-medium" viewBox="0 0 24 24"><path d="M12 1c6.075 0 11 4.925 11 11s-4.925 11-11 11S1 18.075 1 12 5.925 1 12 1Zm0 2a9 9 0 1 0 0 18 9 9 0 0 0 0-18ZM8.707 7.293 12 10.585l3.293-3.292a1 1 0 0 1 1.414 1.414L13.415 12l3.292 3.293a1 1 0 0 1-1.414 1.414L12 13.415l-3.293 3.292a1 1 0 1 1-1.414-1.414L10.585 12 7.293 8.707a1 1 0 0 1 1.414-1.414Z"/></symbol><symbol id="icon-eds-i-cloud-upload-medium" viewBox="0 0 24 24"><path d="m12.852 10.011.028-.004L13 10l.075.003.126.017.086.022.136.052.098.052.104.074.082.073 3 3a1 1 0 0 1 0 1.414l-.094.083a1 1 0 0 1-1.32-.083L14 13.416V20a1 1 0 0 1-2 0v-6.586l-1.293 1.293a1 1 0 0 1-1.32.083l-.094-.083a1 1 0 0 1 0-1.414l3-3 .112-.097.11-.071.114-.054.105-.035.118-.025Zm.587-7.962c3.065.362 5.497 2.662 5.992 5.562l.013.085.207.073c2.117.782 3.496 2.845 3.337 5.097l-.022.226c-.297 2.561-2.503 4.491-5.124 4.502a1 1 0 1 1-.009-2c1.619-.007 2.967-1.186 3.147-2.733.179-1.542-.86-2.979-2.487-3.353-.512-.149-.894-.579-.981-1.165-.21-2.237-2-4.035-4.308-4.308-2.31-.273-4.497 1.06-5.25 3.19l-.049.113c-.234.468-.718.756-1.176.743-1.418.057-2.689.857-3.32 2.084a3.668 3.668 0 0 0 .262 3.798c.796 1.136 2.169 1.764 3.583 1.635a1 1 0 1 1 .182 1.992c-2.125.194-4.193-.753-5.403-2.48a5.668 5.668 0 0 1-.403-5.86c.85-1.652 2.449-2.79 4.323-3.092l.287-.039.013-.028c1.207-2.741 4.125-4.404 7.186-4.042Z"/></symbol><symbol 
id="icon-eds-i-collection-medium" viewBox="0 0 24 24"><path d="M21 7a1 1 0 0 1 1 1v12.5a2.5 2.5 0 0 1-2.5 2.5H8a1 1 0 0 1 0-2h11.5a.5.5 0 0 0 .5-.5V8a1 1 0 0 1 1-1Zm-5.5-5A2.5 2.5 0 0 1 18 4.5v12a2.5 2.5 0 0 1-2.5 2.5h-11A2.5 2.5 0 0 1 2 16.5v-12A2.5 2.5 0 0 1 4.5 2h11Zm0 2h-11a.5.5 0 0 0-.5.5v12a.5.5 0 0 0 .5.5h11a.5.5 0 0 0 .5-.5v-12a.5.5 0 0 0-.5-.5ZM13 13a1 1 0 0 1 0 2H7a1 1 0 0 1 0-2h6Zm0-3.5a1 1 0 0 1 0 2H7a1 1 0 0 1 0-2h6ZM13 6a1 1 0 0 1 0 2H7a1 1 0 1 1 0-2h6Z"/></symbol><symbol id="icon-eds-i-conference-series-medium" viewBox="0 0 24 24"><path fill-rule="evenodd" d="M4.5 2A2.5 2.5 0 0 0 2 4.5v11A2.5 2.5 0 0 0 4.5 18h2.37l-2.534 2.253a1 1 0 0 0 1.328 1.494L9.88 18H11v3a1 1 0 1 0 2 0v-3h1.12l4.216 3.747a1 1 0 0 0 1.328-1.494L17.13 18h2.37a2.5 2.5 0 0 0 2.5-2.5v-11A2.5 2.5 0 0 0 19.5 2h-15ZM20 6V4.5a.5.5 0 0 0-.5-.5h-15a.5.5 0 0 0-.5.5V6h16ZM4 8v7.5a.5.5 0 0 0 .5.5h15a.5.5 0 0 0 .5-.5V8H4Z" clip-rule="evenodd"/></symbol><symbol id="icon-eds-i-delivery-medium" viewBox="0 0 24 24"><path d="M8.51 20.598a3.037 3.037 0 0 1-3.02 0A2.968 2.968 0 0 1 4.161 19L3.5 19A2.5 2.5 0 0 1 1 16.5v-11A2.5 2.5 0 0 1 3.5 3h10a2.5 2.5 0 0 1 2.45 2.004L16 5h2.527c.976 0 1.855.585 2.27 1.49l2.112 4.62a1 1 0 0 1 .091.416v4.856C23 17.814 21.889 19 20.484 19h-.523a1.01 1.01 0 0 1-.121-.007 2.96 2.96 0 0 1-1.33 1.605 3.037 3.037 0 0 1-3.02 0A2.968 2.968 0 0 1 14.161 19H9.838a2.968 2.968 0 0 1-1.327 1.597Zm-2.024-3.462a.955.955 0 0 0-.481.73L5.999 18l.001.022a.944.944 0 0 0 .388.777l.098.065c.316.181.712.181 1.028 0A.97.97 0 0 0 8 17.978a.95.95 0 0 0-.486-.842 1.037 1.037 0 0 0-1.028 0Zm10 0a.955.955 0 0 0-.481.73l-.005.156a.944.944 0 0 0 .388.777l.098.065c.316.181.712.181 1.028 0a.97.97 0 0 0 .486-.886.95.95 0 0 0-.486-.842 1.037 1.037 0 0 0-1.028 0ZM21 12h-5v3.17a3.038 3.038 0 0 1 2.51.232 2.993 2.993 0 0 1 1.277 1.45l.058.155.058-.005.581-.002c.27 0 .516-.263.516-.618V12Zm-7.5-7h-10a.5.5 0 0 0-.5.5v11a.5.5 0 0 0 .5.5h.662a2.964 2.964 0 0 1 1.155-1.491l.172-.107a3.037 3.037 0 0 1 3.022 0A2.987 2.987 0 0 1 9.843 17H13.5a.5.5 0 0 0 .5-.5v-11a.5.5 0 0 0-.5-.5Zm5.027 2H16v3h4.203l-1.224-2.677a.532.532 0 0 0-.375-.316L18.527 7Z"/></symbol><symbol id="icon-eds-i-download-medium" viewBox="0 0 24 24"><path d="M22 18.5a3.5 3.5 0 0 1-3.5 3.5h-13A3.5 3.5 0 0 1 2 18.5V18a1 1 0 0 1 2 0v.5A1.5 1.5 0 0 0 5.5 20h13a1.5 1.5 0 0 0 1.5-1.5V18a1 1 0 0 1 2 0v.5Zm-3.293-7.793-6 6-.063.059-.093.069-.081.048-.105.049-.104.034-.056.013-.118.017L12 17l-.076-.003-.122-.017-.113-.03-.085-.032-.063-.03-.098-.058-.06-.043-.05-.043-6.04-6.037a1 1 0 0 1 1.414-1.414l4.294 4.29L11 3a1 1 0 0 1 2 0l.001 10.585 4.292-4.292a1 1 0 0 1 1.32-.083l.094.083a1 1 0 0 1 0 1.414Z"/></symbol><symbol id="icon-eds-i-edit-medium" viewBox="0 0 24 24"><path d="M17.149 2a2.38 2.38 0 0 1 1.699.711l2.446 2.46a2.384 2.384 0 0 1 .005 3.38L10.01 19.906a1 1 0 0 1-.434.257l-6.3 1.8a1 1 0 0 1-1.237-1.237l1.8-6.3a1 1 0 0 1 .257-.434L15.443 2.718A2.385 2.385 0 0 1 17.15 2Zm-3.874 5.689-7.586 7.536-1.234 4.319 4.318-1.234 7.54-7.582-3.038-3.039ZM17.149 4a.395.395 0 0 0-.286.126L14.695 6.28l3.029 3.029 2.162-2.173a.384.384 0 0 0 .106-.197L20 6.864c0-.103-.04-.2-.119-.278l-2.457-2.47A.385.385 0 0 0 17.149 4Z"/></symbol><symbol id="icon-eds-i-education-medium" viewBox="0 0 24 24"><path fill-rule="evenodd" d="M12.41 2.088a1 1 0 0 0-.82 0l-10 4.5a1 1 0 0 0 0 1.824L3 9.047v7.124A3.001 3.001 0 0 0 4 22a3 3 0 0 0 1-5.83V9.948l1 .45V14.5a1 1 0 0 0 .087.408L7 14.5c-.913.408-.912.41-.912.41l.001.003.003.006.007.015a1.988 1.988 0 0 0 .083.16c.054.097.131.225.236.373.21.297.53.68.993 
ImageNet Large Scale Visual Recognition Challenge

Published: 11 April 2015
Volume 115, pages 211–252, (2015)
Olga Russakovsky, Jia Deng, Hao Su, Jonathan Krause, Sanjeev Satheesh, Sean Ma, Zhiheng Huang, Andrej Karpathy, Aditya Khosla, Michael Bernstein, Alexander C. Berg & Li Fei-Fei

131k Accesses · 23k Citations · 87 Altmetric · 5 Mentions
Abstract

The ImageNet Large Scale Visual Recognition Challenge is a benchmark in object category classification and detection on hundreds of object categories and millions of images. The challenge has been run annually from 2010 to present, attracting participation from more than fifty institutions. This paper describes the creation of this benchmark dataset and the advances in object recognition that have been possible as a result. We discuss the challenges of collecting large-scale ground truth annotation, highlight key breakthroughs in categorical object recognition, provide a detailed analysis of the current state of the field of large-scale image classification and object detection, and compare the state-of-the-art computer vision accuracy with human accuracy. We conclude with lessons learned in the 5 years of the challenge, and propose future directions and improvements.
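The large-scale image classification analysis the abstract refers to is reported in terms of top-k error, the challenge's standard metric: an image counts as correct if the ground-truth label appears among the classifier's k highest-scoring predictions (k = 5 for ILSVRC classification). The sketch below is a minimal NumPy illustration of that computation; the function name and the toy data are ours, not the paper's.

```python
import numpy as np

def top_k_error(scores, labels, k=5):
    """ILSVRC-style top-k error: fraction of images whose true label
    is absent from the k highest-scoring predicted classes.

    scores: (n_images, n_classes) array of classifier scores.
    labels: (n_images,) array of ground-truth class indices.
    """
    # Indices of the k highest-scoring classes for each image
    # (order within the top k does not matter for the metric).
    topk = np.argsort(scores, axis=1)[:, -k:]
    hits = (topk == labels[:, None]).any(axis=1)
    return 1.0 - hits.mean()

# Toy usage: 3 images, 10 classes, random scores.
rng = np.random.default_rng(0)
scores = rng.random((3, 10))
labels = np.array([2, 7, 0])
print(f"top-5 error: {top_k_error(scores, labels):.3f}")
```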
</p> </div> <div data-test="access-article" class="app-article-access"> <h2 class="app-article-access__heading">Access this article</h2> <div class="u-ma-16 u-clear-both"> <a href="//wayf.springernature.com?redirect_uri&#x3D;https%3A%2F%2Flink.springer.com%2Farticle%2F10.1007%2Fs11263-015-0816-y%3FfromPaywallRec%3Dfalse%26error%3Dcookies_not_supported%26code%3D87e99a51-693a-4fca-9006-8a4046de5dd8" class="u-button u-button--full-width u-button--primary u-justify-content-space-between c-pdf-download__link" data-track="click" data-track-action="institution access" data-track-label="button"> <span data-test="access-via-institution">Log in via an institution</span> <svg aria-hidden="true" focusable="false" width="24" height="24" class="u-icon"> <use xlink:href="#icon-eds-i-arrow-right-medium"></use> </svg> </a> </div> <div data-test="buy-box-mobile" class="c-article-buy-box"> <div class="sprcom-buybox-articleDarwin" id="sprcom-buybox-articleDarwin"> <!-- rendered: 2024-11-23T21:56:01.556376 --><!-- Darwin version --> <div class="buying-option" data-test-id="buy-article-darwin"> <div> <div class="c-springer-plus"> <h2 class="springer-plus-heading">Subscribe and save</h2> <div class="springer-plus"> <div class="springer-plus-headline"> <div class="springer-plus-title"> <svg aria-hidden="true" focusable="false" width="16" height="16" class="u-icon"> <use xlink:href="#icon-eds-i-check-filled-medium"></use> </svg><span>Springer+ Basic</span> </div> <div class="dd price-amount-springer-plus"> €32.70 /Month </div> </div> <ul class="buying-option-usps"> <li>Get 10 units per month</li> <li>Download Article/Chapter or eBook</li> <li>1 Unit = 1 Article or 1 Chapter</li> <li>Cancel anytime</li> </ul><a href="https://link.springer.com/product/springer-plus" id="btn-subscribe-springerPlus" class="u-button u-button--full-width u-button--secondary" data-track="click||click_springer_subscribe" data-track-context="buy box"><span>Subscribe now </span> <svg aria-hidden="true" focusable="false" width="16" height="16" class="u-icon"> <use xlink:href="#icon-eds-i-arrow-right-medium"></use> </svg></a> </div> <h2 class="springer-plus-heading">Buy Now</h2> </div> <div class="buybox__buy"> <form action="https://order.springer.com/public/cart" method="post"> <input type="hidden" name="type" value="article"><input type="hidden" name="doi" value="10.1007/s11263-015-0816-y"><input type="hidden" name="isxn" value="1573-1405"><input type="hidden" name="contenttitle" value="ImageNet Large Scale Visual Recognition Challenge"><input type="hidden" name="copyrightyear" value="2015"><input type="hidden" name="year" value="2015"><input type="hidden" name="authors" value="Olga Russakovsky, et al."><input type="hidden" name="title" value="International Journal of Computer Vision"><input type="hidden" name="mac" value="d82d2a23bed70747dba1952adf2c2eb9"> <div class="u-ma-16"> <button type="submit" class="u-button--small u-button u-button--secondary u-button--full-width" onclick="dataLayer.push({&quot;event&quot;:&quot;addToCart&quot;,&quot;ecommerce&quot;:{&quot;currencyCode&quot;:&quot;EUR&quot;,&quot;add&quot;:{&quot;products&quot;:[{&quot;name&quot;:&quot;ImageNet Large Scale Visual Recognition Challenge&quot;,&quot;id&quot;:&quot;1573-1405&quot;,&quot;price&quot;:39.95,&quot;brand&quot;:&quot;Springer US&quot;,&quot;category&quot;:&quot;Computer Imaging, Vision, Pattern Recognition and Graphics&quot;,&quot;variant&quot;:&quot;ppv-article&quot;,&quot;quantity&quot;:1}]}}});"><span>Buy article PDF 39,95 €</span></button> </div> </form> 
<p class="c-notes__text c-notes__vat">Price includes VAT (Hong Kong/P.R.China)<br></p> <p class="c-notes__text c-notes__usp">Instant access to the full article PDF.</p> </div> </div> <script>dataLayer.push({"ecommerce":{"currency":"EUR","impressions":[{"name":"ImageNet Large Scale Visual Recognition Challenge","id":"1573-1405","price":39.95,"brand":"Springer US","category":"Computer Imaging, Vision, Pattern Recognition and Graphics","variant":"ppv-article","quantity":1}]}});</script> <script style="display: none"> ;(function () { if (document.cookie.indexOf("feature-monetise-subscriptions-display-springer-plus") > -1) { document.querySelectorAll(".c-springer-plus").forEach(function(node) { node.style.display = "block" }) } // springerPlus roll out 10% starts here var springerPlusGroup = setLocalStorageSpringerPlus(); var rollOutSpringerPlus = springerPlusGroup === "B" function setLocalStorageSpringerPlus() { var selectUserKey = "springerPlusRollOut"; var springerPlusGroup = "X"; if (!window.localStorage) return springerPlusGroup; try { var selectUserValue = window.localStorage.getItem(selectUserKey) springerPlusGroup = selectUserValue || randomDistributionSpringerPlus(selectUserKey) } catch (err) { console.log(err) } return springerPlusGroup; } function randomDistributionSpringerPlus(selectUserKey) { var randomGroup = Math.random() < 0.9 ? "A" : "B" window.localStorage.setItem(selectUserKey, randomGroup) return randomGroup } if (rollOutSpringerPlus) { revealSpringerPlus(); } function revealSpringerPlus() { var article = document.getElementById("sprcom-buybox-articleDarwin"); if(article) { document.querySelectorAll(".c-springer-plus").forEach(function(node) { node.style.display = "block" }) } } //springerPlus ends here })() </script> <style> .springer-plus .buying-option-usps > li::before { background-image: url("data:image/svg+xml,%3Csvg viewBox='0 0 100 100' xmlns='http://www.w3.org/2000/svg' fill='%230070A8'%3E%3Ccircle cx='50' cy='50' r='50'/%3E%3C/svg%3E"); } </style> </div> <article class="buybox__rent-article buybox__access-option u-sans-serif" id="deepdyve" style="display: none" data-test-id="journal-subscription"> <div class="c-box__body"> <div class="buybox__info"> <p>Rent this article via <a class="deepdyve-link" target="deepdyve" rel="nofollow" data-track="click" data-track-action="rent article" data-track-label="rent action, new buybox">DeepDyve</a> <svg focusable="false" role="img" aria-hidden="true" class="u-icon" style="vertical-align: middle"> <use xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="#icon-eds-i-external-link-small"></use> </svg></p> </div> </div> <script> function deepDyveResponse(data) { if (data.status === 'ok') { [].slice.call(document.querySelectorAll('.buybox__rent-article')).forEach(function (article) { article.style.display = 'flex' var link = article.querySelector('.deepdyve-link') if (link) { link.setAttribute('href', data.url) } }) } } var script = document.createElement('script') script.src = '//www.deepdyve.com/rental-link?docId=10.1007/s11263-015-0816-y&journal=1573-1405&fieldName=journal_doi&affiliateId=springer&format=jsonp&callback=deepDyveResponse' document.body.appendChild(script) </script> </article> <div class="buybox__access-option buybox__institutional-subs-link u-sans-serif"> <p><a href="https://www.springernature.com/gp/librarians/licensing/agc/journals">Institutional subscriptions <svg aria-hidden="true" focusable="false" width="24" height="24" class="u-icon" style="vertical-align: middle"> <use 
xlink:href="#icon-eds-i-arrow-right-medium"></use> </svg></a></p> </div> <style>.sprcom-buybox-articleDarwin .buybox__access-option{ border-top: 1px solid #cedbe0; font-size: 1rem; padding: 16px; } .sprcom-buybox-articleDarwin .c-springer-plus{ display: none; } .sprcom-buybox-articleDarwin .springer-plus{ background-color: #EBF6FF; font-family: 'Merriweather Sans', 'Helvetica Neue', Helvetica, Arial, sans-serif; padding: 16px; } .sprcom-buybox-articleDarwin .springer-plus-headline{ display: flex; justify-content: space-between; } .sprcom-buybox-articleDarwin .springer-plus-heading{ border-bottom: 1px solid #c5e0f4; border-top: 1px solid #c5e0f4; font-family: 'Merriweather Sans', 'Helvetica Neue', Helvetica, Arial, sans-serif; font-size: 1.125rem; font-weight: 700; margin: 0; padding: 16px; text-align: center; } .sprcom-buybox-articleDarwin .springer-plus-title{ align-items: center; display: flex; } .sprcom-buybox-articleDarwin .springer-plus-title span{ margin-left: 8px; } .sprcom-buybox-articleDarwin .springer-plus a{ background-color: #fff; border: 1px solid #025e8d; color: #025e8d; font-size: 16px; font-weight: 700; max-height: 44px; } .sprcom-buybox-articleDarwin .springer-plus a span{ margin-right: 8px; } .sprcom-buybox-articleDarwin .springer-plus a:hover{ background-color: #025e8d; border: 4px solid #025e8d; box-shadow: none; color: #fff; font-weight: 700; } .sprcom-buybox-articleDarwin .springer-plus a:visited{ color: #025e8d; } .sprcom-buybox-articleDarwin .springer-plus a:visited:hover{ color: #fff; } .sprcom-buybox-articleDarwin .springer-plus .buying-option-usps{ color: #555; font-size: 1rem; line-height: 1.6; list-style: none; margin: 0; padding: 16px 0 24px 0; } .sprcom-buybox-articleDarwin .springer-plus .buying-option-usps > li{ padding-left: 26px; position: relative; } .sprcom-buybox-articleDarwin .springer-plus .buying-option-usps > li::before{ content: ''; height: 10px; left: 0; position: absolute; top: calc(0.8em - 5px); width: 10px; } .sprcom-buybox-articleDarwin .springer-plus .buying-option-usps > li:not(:first-child){ margin-top: 4px; } </style> </div> </div> </div> <div class="u-display-none"> <div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-1"><figure><figcaption><b id="Fig1" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 1</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig1_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig1_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-2"><figure><figcaption><b id="Fig2" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 
2</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig2_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig2_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-3"><figure><figcaption><b id="Fig3" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 3</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig3_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig3_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-4"><figure><figcaption><b id="Fig4" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 4</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig4_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig4_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-5"><figure><figcaption><b id="Fig5" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 5</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig5_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig5_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-6"><figure><figcaption><b id="Fig6" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 
6</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig6_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig6_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-7"><figure><figcaption><b id="Fig7" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 7</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig7_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig7_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-8"><figure><figcaption><b id="Fig8" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 8</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig8_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig8_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-9"><figure><figcaption><b id="Fig9" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 9</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig9_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig9_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-10"><figure><figcaption><b id="Fig10" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 
10</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig10_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig10_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-11"><figure><figcaption><b id="Fig11" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 11</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig11_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig11_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-12"><figure><figcaption><b id="Fig12" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 12</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig12_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig12_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-13"><figure><figcaption><b id="Fig13" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 13</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig13_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig13_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-14"><figure><figcaption><b id="Fig14" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 
14</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig14_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig14_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div><div class="c-article-section__figure js-c-reading-companion-figures-item" data-test="figure" data-container-section="figure" id="figure-15"><figure><figcaption><b id="Fig15" class="c-article-section__figure-caption" data-test="figure-caption-text">Fig. 15</b></figcaption><div class="c-article-section__figure-content"><div class="c-article-section__figure-item"><picture><source type="image/webp" srcset="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig15_HTML.gif?as=webp"><img src="//media.springernature.com/m312/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Fig15_HTML.gif" alt="" loading="lazy"></picture></div></div></figure></div> </div> <div data-test="cobranding-download"> </div> <section aria-labelledby="inline-recommendations" data-title="Inline Recommendations" class="c-article-recommendations" data-track-component="inline-recommendations"> <h3 class="c-article-recommendations-title" id="inline-recommendations">Similar content being viewed by others</h3> <div class="c-article-recommendations-list"> <div class="c-article-recommendations-list__item"> <article class="c-article-recommendations-card" itemscope itemtype="http://schema.org/ScholarlyArticle"> <div class="c-article-recommendations-card__img"><img src="https://media.springernature.com/w92h120/springer-static/cover-hires/book/978-3-031-20083-0?as&#x3D;webp" loading="lazy" alt=""></div> <div class="c-article-recommendations-card__main"> <h3 class="c-article-recommendations-card__heading" itemprop="name headline"> <a class="c-article-recommendations-card__link" itemprop="url" href="https://link.springer.com/10.1007/978-3-031-20083-0_1?fromPaywallRec=true" data-track="select_recommendations_1" data-track-context="inline recommendations" data-track-action="click recommendations inline - 1" data-track-label="10.1007/978-3-031-20083-0_1">A Simple Approach and Benchmark for 21,000-Category Object Detection </a> </h3> <div class="c-article-meta-recommendations" data-test="recommendation-info"> <span class="c-article-meta-recommendations__item-type">Chapter</span> <span class="c-article-meta-recommendations__date">© 2022</span> </div> </div> </article> </div> <div class="c-article-recommendations-list__item"> <article class="c-article-recommendations-card" itemscope itemtype="http://schema.org/ScholarlyArticle"> <div class="c-article-recommendations-card__img"><img src="https://media.springernature.com/w92h120/springer-static/cover-hires/book/978-3-319-10602-1?as&#x3D;webp" loading="lazy" alt=""></div> <div class="c-article-recommendations-card__main"> <h3 class="c-article-recommendations-card__heading" itemprop="name headline"> <a class="c-article-recommendations-card__link" itemprop="url" href="https://link.springer.com/10.1007/978-3-319-10602-1_48?fromPaywallRec=true" data-track="select_recommendations_2" data-track-context="inline recommendations" data-track-action="click recommendations inline - 2" 
data-track-label="10.1007/978-3-319-10602-1_48">Microsoft COCO: Common Objects in Context </a> </h3> <div class="c-article-meta-recommendations" data-test="recommendation-info"> <span class="c-article-meta-recommendations__item-type">Chapter</span> <span class="c-article-meta-recommendations__date">© 2014</span> </div> </div> </article> </div> <div class="c-article-recommendations-list__item"> <article class="c-article-recommendations-card" itemscope itemtype="http://schema.org/ScholarlyArticle"> <div class="c-article-recommendations-card__img"><img src="https://media.springernature.com/w92h120/springer-static/cover-hires/book/978-3-031-20077-9?as&#x3D;webp" loading="lazy" alt=""></div> <div class="c-article-recommendations-card__main"> <h3 class="c-article-recommendations-card__heading" itemprop="name headline"> <a class="c-article-recommendations-card__link" itemprop="url" href="https://link.springer.com/10.1007/978-3-031-20077-9_21?fromPaywallRec=true" data-track="select_recommendations_3" data-track-context="inline recommendations" data-track-action="click recommendations inline - 3" data-track-label="10.1007/978-3-031-20077-9_21">Detecting Twenty-Thousand Classes Using Image-Level Supervision </a> </h3> <div class="c-article-meta-recommendations" data-test="recommendation-info"> <span class="c-article-meta-recommendations__item-type">Chapter</span> <span class="c-article-meta-recommendations__date">© 2022</span> </div> </div> </article> </div> </div> </section> <script> window.dataLayer = window.dataLayer || []; window.dataLayer.push({ recommendations: { recommender: 'semantic', model: 'specter', policy_id: 'NA', timestamp: 1732393429, embedded_user: 'null' } }); </script> <section aria-labelledby="content-related-subjects" data-test="subject-content"> <h3 id="content-related-subjects" class="c-article__sub-heading">Explore related subjects</h3> <span class="u-sans-serif u-text-s u-display-block u-mb-24">Discover the latest articles, news and stories from top researchers in related subjects.</span> <ul class="c-article-subject-list" role="list"> <li class="c-article-subject-list__subject"> <a href="/subject/artificial-intelligence" data-track="select_related_subject_1" data-track-context="related subjects from content page" data-track-label="Artificial Intelligence">Artificial Intelligence</a> </li> </ul> </section> <section data-title="Notes"><div class="c-article-section" id="notes-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="notes">Notes</h2><div class="c-article-section__content" id="notes-content"><ol class="c-article-footnote c-article-footnote--listed"><li class="c-article-footnote--listed__item" id="Fn1" data-counter="1."><div class="c-article-footnote--listed__content"><p>In this paper, we will be using the term <i>object recognition</i> broadly to encompass both <i>image classification</i> (a task requiring an algorithm to determine what object classes are present in the image) as well as <i>object detection</i> (a task requiring an algorithm to localize all objects present in the image).</p></div></li><li class="c-article-footnote--listed__item" id="Fn2" data-counter="2."><div class="c-article-footnote--listed__content"><p>In 2010, the test annotations were later released publicly; since then the test annotation have been kept hidden.</p></div></li><li class="c-article-footnote--listed__item" id="Fn3" data-counter="3."><div class="c-article-footnote--listed__content"><p>In addition, ILSVRC in 2012 also included a 
taster fine-grained classification task, where algorithms would classify dog photographs into one of 120 dog breeds (Khosla et al. <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 2011" title="Khosla, A., Jayadevaprakash, N., Yao, B., &amp; Fei-Fei, L. (2011). Novel dataset for fine-grained image categorization. In First workshop on fine-grained visual categorization, CVPR." href="/article/10.1007/s11263-015-0816-y#ref-CR42" id="ref-link-section-d138070400e858">2011</a>). Fine-grained classification has evolved into its own Fine-Grained classification challenge in 2013 (Berg et al. <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 2013" title="Berg, A., Farrell, R., Khosla, A., Krause, J., Fei-Fei, L., Li, J., &amp; Maji, S. (2013). Fine-grained competition. &#xA; https://sites.google.com/site/fgcomp2013/&#xA; &#xA; ." href="/article/10.1007/s11263-015-0816-y#ref-CR8" id="ref-link-section-d138070400e861">2013</a>), which is outside the scope of this paper.</p></div></li><li class="c-article-footnote--listed__item" id="Fn4" data-counter="4."><div class="c-article-footnote--listed__content"><p> <a href="http://www.flickr.com">www.flickr.com</a>.</p></div></li><li class="c-article-footnote--listed__item" id="Fn5" data-counter="5."><div class="c-article-footnote--listed__content"><p>Some datasets such as PASCAL VOC (Everingham et al. <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 2010" title="Everingham, M., Van Gool, L., Williams, C. K. I., Winn, J., &amp; Zisserman, A. (2010). The Pascal Visual Object Classes (VOC) challenge. International Journal of Computer Vision, 88(2), 303–338." href="/article/10.1007/s11263-015-0816-y#ref-CR19" id="ref-link-section-d138070400e1620">2010</a>) and LabelMe (Russell et al. <a data-track="click" data-track-action="reference anchor" data-track-label="link" data-test="citation-ref" aria-label="Reference 2007" title="Russell, B., Torralba, A., Murphy, K., &amp; Freeman, W. T. (2007). LabelMe: A database and web-based tool for image annotation. In IJCV." href="/article/10.1007/s11263-015-0816-y#ref-CR66" id="ref-link-section-d138070400e1623">2007</a>) are able to provide more detailed annotations: for example, marking individual object instances as being <i>truncated</i>. We chose not to provide this level of detail in favor of annotating more images and more object instances.</p></div></li><li class="c-article-footnote--listed__item" id="Fn6" data-counter="6."><div class="c-article-footnote--listed__content"><p>Some of the training objects are actually annotated with more detailed classes: for example, one of the 200 object classes is the category “dog,” and some training instances are annotated with the specific dog breed.</p></div></li><li class="c-article-footnote--listed__item" id="Fn7" data-counter="7."><div class="c-article-footnote--listed__content"><p>The validation/test split is consistent with ILSVRC2012: validation images of ILSVRC2012 remained in the validation set of ILSVRC2013, and ILSVRC2012 test images remained in ILSVRC2013 test set.</p></div></li><li class="c-article-footnote--listed__item" id="Fn8" data-counter="8."><div class="c-article-footnote--listed__content"><p>In this paper we focus on the mean average precision across all categories as the measure of a team’s performance. 
This is done for simplicity and is justified since the ordering of teams by mean average precision was always the same as the ordering by object categories won.</p></div></li><li class="c-article-footnote--listed__item" id="Fn9" data-counter="9."><div class="c-article-footnote--listed__content"><p>Table <a data-track="click" data-track-label="link" data-track-action="table anchor" href="/article/10.1007/s11263-015-0816-y#Tab8">8</a> omits 4 teams which submitted results but chose not to officially participate in the challenge.</p></div></li><li class="c-article-footnote--listed__item" id="Fn10" data-counter="10."><div class="c-article-footnote--listed__content"><p>Personal communication with members of the UvA team.</p></div></li><li class="c-article-footnote--listed__item" id="Fn11" data-counter="11."><div class="c-article-footnote--listed__content"><p>For rigid versus deformable objects, the average scale in each bin is 34.1–34.2 % for classification and localization, and 13.5–13.7 % for detection. For texture, the average scale in each of the four bins is 31.1–31.3 % for classification and localization, and 12.7–12.8 % for detection.</p></div></li><li class="c-article-footnote--listed__item" id="Fn12" data-counter="12."><div class="c-article-footnote--listed__content"><p>Natural object detection classes are removed from this analysis because there are only 3 and 13 natural untextured and low-textured classes respectively, and none remain after scale normalization. All other bins contain at least 9 object classes after scale normalization.</p></div></li><li class="c-article-footnote--listed__item" id="Fn13" data-counter="13."><div class="c-article-footnote--listed__content"><p> <a href="http://webscope.sandbox.yahoo.com/catalog.php?datatype=i&amp;did=67">http://webscope.sandbox.yahoo.com/catalog.php?datatype=i&amp;did=67</a>.</p></div></li></ol></div></div></section><div id="MagazineFulltextArticleBodySuffix"><section aria-labelledby="Bib1" data-title="References"><div class="c-article-section" id="Bib1-section"><h2 class="c-article-section__title js-section-title js-c-reading-companion-sections-item" id="Bib1">References</h2><div class="c-article-section__content" id="Bib1-content"><div data-container-section="references"><ul class="c-article-references" data-track-component="outbound reference" data-track-context="references section"><li class="c-article-references__item js-c-reading-companion-references-item"><p class="c-article-references__text" id="ref-CR1">Ahonen, T., Hadid, A., &amp; Pietikinen, M. (2006). Face description with local binary patterns: Application to face recognition. 
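The measure in footnote 8 can be made concrete with a small sketch. The Python fragment below is illustrative only, not the official ILSVRC evaluation code: it assumes each detection has already been matched against the ground truth (for example, by an intersection-over-union threshold), so that a detection reduces to a (confidence, is-true-positive) pair, and it computes per-class average precision as the area under the precision-recall curve before averaging across classes.

def average_precision(detections, num_positives):
    """AP for one class: area under the precision-recall curve.

    detections: list of (confidence, is_true_positive) pairs.
    num_positives: number of ground-truth instances of the class.
    """
    if num_positives == 0:
        return 0.0
    # Rank detections by decreasing confidence.
    ranked = sorted(detections, key=lambda d: d[0], reverse=True)
    true_positives, ap, prev_recall = 0, 0.0, 0.0
    for rank, (_, is_tp) in enumerate(ranked, start=1):
        if is_tp:
            true_positives += 1
            recall = true_positives / num_positives
            precision = true_positives / rank
            # Recall rises by 1/num_positives at each true positive;
            # accumulate the corresponding slice of area under the curve.
            ap += precision * (recall - prev_recall)
            prev_recall = recall
    return ap

def mean_average_precision(per_class):
    """per_class maps class name -> (detections, num_positives)."""
    aps = [average_precision(d, n) for d, n in per_class.values()]
    return sum(aps) / len(aps)

# Toy example with hypothetical detections: prints 0.75.
per_class = {
    "dog":  ([(0.9, True), (0.8, False), (0.7, True)], 2),
    "bird": ([(0.6, True), (0.5, True)], 3),
}
print(mean_average_precision(per_class))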
References

Ahonen, T., Hadid, A., & Pietikäinen, M. (2006). Face description with local binary patterns: Application to face recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(14), 2037–2041.

Alexe, B., Deselaers, T., & Ferrari, V. (2012). Measuring the objectness of image windows. IEEE Transactions on Pattern Analysis and Machine Intelligence, 34(11), 2189–2202.

Arandjelovic, R., & Zisserman, A. (2012). Three things everyone should know to improve object retrieval. In CVPR.

Arbeláez, P., Pont-Tuset, J., Barron, J., Marques, F., & Malik, J. (2014). Multiscale combinatorial grouping. In CVPR.

Arbeláez, P., Maire, M., Fowlkes, C., & Malik, J. (2011). Contour detection and hierarchical image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33, 898–916.

Batra, D., Agrawal, H., Banik, P., Chavali, N., Mathialagan, C. S., & Alfadda, A. (2013). CloudCV: Large-scale distributed computer vision as a cloud service.

Bell, S., Upchurch, P., Snavely, N., & Bala, K. (2013). OpenSurfaces: A richly annotated catalog of surface appearance. In ACM Transactions on Graphics (SIGGRAPH).

Berg, A., Farrell, R., Khosla, A., Krause, J., Fei-Fei, L., Li, J., & Maji, S. (2013). Fine-grained competition. https://sites.google.com/site/fgcomp2013/.

Chatfield, K., Simonyan, K., Vedaldi, A., & Zisserman, A. (2014). Return of the devil in the details: Delving deep into convolutional nets. CoRR, abs/1405.3531.

Chen, Q., Song, Z., Huang, Z., Hua, Y., & Yan, S. (2014). Contextualizing object detection and classification. In CVPR.

Crammer, K., Dekel, O., Keshet, J., Shalev-Shwartz, S., & Singer, Y. (2006). Online passive-aggressive algorithms. Journal of Machine Learning Research, 7, 551–585.

Criminisi, A. (2004). Microsoft Research Cambridge (MSRC) object recognition image database (version 2.0). http://research.microsoft.com/vision/cambridge/recognition.

Dean, T., Ruzon, M., Segal, M., Shlens, J., Vijayanarasimhan, S., & Yagnik, J. (2013). Fast, accurate detection of 100,000 object classes on a single machine. In CVPR.

Deng, J., Dong, W., Socher, R., Li, L.-J., Li, K., & Fei-Fei, L. (2009). ImageNet: A large-scale hierarchical image database. In CVPR.

Deng, J., Russakovsky, O., Krause, J., Bernstein, M., Berg, A. C., & Fei-Fei, L. (2014). Scalable multi-label annotation. In CHI.

Donahue, J., Jia, Y., Vinyals, O., Hoffman, J., Zhang, N., Tzeng, E., & Darrell, T. (2013). DeCAF: A deep convolutional activation feature for generic visual recognition. CoRR, abs/1310.1531.

Dubout, C., & Fleuret, F. (2012). Exact acceleration of linear object detectors. In ECCV.

Everingham, M., Van Gool, L., Williams, C., Winn, J., & Zisserman, A. (2005–2012). PASCAL Visual Object Classes Challenge (VOC). http://www.pascal-network.org/challenges/VOC/voc2012/workshop/index.html.

Everingham, M., Van Gool, L., Williams, C. K. I., Winn, J., & Zisserman, A. (2010). The PASCAL Visual Object Classes (VOC) challenge. International Journal of Computer Vision, 88(2), 303–338.

Everingham, M., Eslami, S. M. A., Van Gool, L., Williams, C. K. I., Winn, J., & Zisserman, A. (2014). The PASCAL Visual Object Classes (VOC) challenge: A retrospective. International Journal of Computer Vision, 111, 98–136.

Fei-Fei, L., & Perona, P. (2005). A Bayesian hierarchical model for learning natural scene categories. In CVPR.

Fei-Fei, L., Fergus, R., & Perona, P. (2004). Learning generative visual models from few examples: An incremental Bayesian approach tested on 101 object categories. In CVPR.

Felzenszwalb, P., Girshick, R., McAllester, D., & Ramanan, D. (2010). Object detection with discriminatively trained part based models. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(9), 1627–1645.

Frome, A., Corrado, G., Shlens, J., Bengio, S., Dean, J., Ranzato, M., & Mikolov, T. (2013). DeViSE: A deep visual-semantic embedding model. In NIPS.

Geiger, A., Lenz, P., Stiller, C., & Urtasun, R. (2013). Vision meets robotics: The KITTI dataset. International Journal of Robotics Research, 32, 1231–1237.

Girshick, R. B., Donahue, J., Darrell, T., & Malik, J. (2013). Rich feature hierarchies for accurate object detection and semantic segmentation (v4). CoRR.

Girshick, R., Donahue, J., Darrell, T., & Malik, J. (2014). Rich feature hierarchies for accurate object detection and semantic segmentation. In CVPR.

Gould, S., Fulton, R., & Koller, D. (2009). Decomposing a scene into geometric and semantically consistent regions. In ICCV.

Graham, B. (2013). Sparse arrays of signatures for online character recognition. CoRR.

Griffin, G., Holub, A., & Perona, P. (2007). Caltech-256 object category dataset. Technical report 7694, Caltech.

Harada, T., & Kuniyoshi, Y. (2012). Graphical Gaussian vector for image categorization. In NIPS.

Harel, J., Koch, C., & Perona, P. (2007). Graph-based visual saliency. In NIPS.

He, K., Zhang, X., Ren, S., & Sun, J. (2014). Spatial pyramid pooling in deep convolutional networks for visual recognition. In ECCV.
Hinton, G. E., Srivastava, N., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. (2012). Improving neural networks by preventing co-adaptation of feature detectors. CoRR, abs/1207.0580.

Hoiem, D., Chodpathumwan, Y., & Dai, Q. (2012). Diagnosing error in object detectors. In ECCV.

Howard, A. (2014). Some improvements on deep convolutional neural network based image classification. In ICLR.

Huang, G. B., Ramesh, M., Berg, T., & Learned-Miller, E. (2007). Labeled faces in the wild: A database for studying face recognition in unconstrained environments. Technical report 07-49, University of Massachusetts, Amherst.

Iandola, F. N., Moskewicz, M. W., Karayev, S., Girshick, R. B., Darrell, T., & Keutzer, K. (2014). DenseNet: Implementing efficient convnet descriptor pyramids. CoRR.

Jia, Y. (2013). Caffe: An open source convolutional architecture for fast feature embedding. http://caffe.berkeleyvision.org/.

Jojic, N., Frey, B. J., & Kannan, A. (2003). Epitomic analysis of appearance and shape. In ICCV.

Kanezaki, A., Inaba, S., Ushiku, Y., Yamashita, Y., Muraoka, H., Kuniyoshi, Y., & Harada, T. (2014). Hard negative classes for multiple object detection. In ICRA.

Khosla, A., Jayadevaprakash, N., Yao, B., & Fei-Fei, L. (2011). Novel dataset for fine-grained image categorization. In First workshop on fine-grained visual categorization, CVPR.

Krizhevsky, A., Sutskever, I., & Hinton, G. (2012). ImageNet classification with deep convolutional neural networks. In NIPS.

Kuettel, D., Guillaumin, M., & Ferrari, V. (2012). Segmentation propagation in ImageNet. In ECCV.

Lazebnik, S., Schmid, C., & Ponce, J. (2006). Beyond bags of features: Spatial pyramid matching for recognizing natural scene categories. In CVPR.

Lin, M., Chen, Q., & Yan, S. (2014a). Network in network. In ICLR.

Lin, Y., Lv, F., Cao, L., Zhu, S., Yang, M., Cour, T., Yu, K., & Huang, T. (2011). Large-scale image classification: Fast feature extraction and SVM training. In CVPR.

Lin, T.-Y., Maire, M., Belongie, S., Hays, J., Perona, P., Ramanan, D., Dollár, P., & Zitnick, C. L. (2014b). Microsoft COCO: Common objects in context. In ECCV.

Liu, C., Yuen, J., & Torralba, A. (2011). Nonparametric scene parsing via label transfer. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32, 2368–2382.

Lowe, D. G. (2004). Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 60(2), 91–110.

Maji, S., & Malik, J. (2009). Object detection using a max-margin Hough transform. In CVPR.

Manen, S., Guillaumin, M., & Van Gool, L. (2013). Prime object proposals with randomized Prim's algorithm. In ICCV.

Mensink, T., Verbeek, J., Perronnin, F., & Csurka, G. (2012). Metric learning for large scale image classification: Generalizing to new classes at near-zero cost. In ECCV.

Mikolov, T., Chen, K., Corrado, G., & Dean, J. (2013). Efficient estimation of word representations in vector space. In ICLR.

Miller, G. A. (1995). WordNet: A lexical database for English. Communications of the ACM, 38(11), 39–41.

Oliva, A., & Torralba, A. (2001). Modeling the shape of the scene: A holistic representation of the spatial envelope. In IJCV.

Ordonez, V., Deng, J., Choi, Y., Berg, A. C., & Berg, T. L. (2013). From large scale image categorization to entry-level categories. In ICCV.

Ouyang, W., & Wang, X. (2013). Joint deep learning for pedestrian detection. In ICCV.

Ouyang, W., Luo, P., Zeng, X., Qiu, S., Tian, Y., Li, H., Yang, S., Wang, Z., Xiong, Y., Qian, C., Zhu, Z., Wang, R., Loy, C. C., Wang, X., & Tang, X. (2014). DeepID-Net: Multi-stage and deformable deep convolutional neural networks for object detection. CoRR, abs/1409.3505.

Papandreou, G. (2014). Deep epitomic convolutional neural networks. CoRR.

Papandreou, G., Chen, L.-C., & Yuille, A. L. (2014). Modeling image patches with a generic dictionary of mini-epitomes.

Perronnin, F., & Dance, C. R. (2007). Fisher kernels on visual vocabularies for image categorization. In CVPR.

Perronnin, F., Akata, Z., Harchaoui, Z., & Schmid, C. (2012). Towards good practice in large-scale learning for image classification. In CVPR.

Perronnin, F., Sánchez, J., & Mensink, T. (2010). Improving the Fisher kernel for large-scale image classification. In ECCV (4).

Russakovsky, O., Deng, J., Huang, Z., Berg, A., & Fei-Fei, L. (2013). Detecting avocados to zucchinis: What have we done, and where are we going? In ICCV.

Russell, B., Torralba, A., Murphy, K., & Freeman, W. T. (2007). LabelMe: A database and web-based tool for image annotation. In IJCV.

Sanchez, J., & Perronnin, F. (2011). High-dimensional signature compression for large-scale image classification. In CVPR.

Sanchez, J., Perronnin, F., & de Campos, T. (2012). Modeling spatial layout of images beyond spatial pyramids. In PRL.

Scheirer, W., Kumar, N., Belhumeur, P. N., & Boult, T. E. (2012). Multi-attribute spaces: Calibration for attribute fusion and similarity search. In CVPR.

Schmidhuber, J. (2012). Multi-column deep neural networks for image classification. In CVPR.

Sermanet, P., Eigen, D., Zhang, X., Mathieu, M., Fergus, R., & LeCun, Y. (2013). OverFeat: Integrated recognition, localization and detection using convolutional networks. CoRR, abs/1312.6229.

Sheng, V. S., Provost, F., & Ipeirotis, P. G. (2008). Get another label? Improving data quality and data mining using multiple, noisy labelers. In SIGKDD.

Simonyan, K., & Zisserman, A. (2014). Very deep convolutional networks for large-scale image recognition. CoRR, abs/1409.1556.

Simonyan, K., Vedaldi, A., & Zisserman, A. (2013). Deep Fisher networks for large-scale image classification. In NIPS.

Sorokin, A., & Forsyth, D. (2008). Utility data annotation with Amazon Mechanical Turk. In InterNet08.

Su, H., Deng, J., & Fei-Fei, L. (2012). Crowdsourcing annotations for visual object detection. In AAAI human computation workshop.

Szegedy, C., Liu, W., Jia, Y., Sermanet, P., Reed, S., Anguelov, D., Erhan, D., & Rabinovich, A. (2014). Going deeper with convolutions. Technical report.

Tang, Y. (2013). Deep learning using support vector machines. CoRR, abs/1306.0239.

Thorpe, S., Fize, D., Marlot, C., et al. (1996). Speed of processing in the human visual system. Nature, 381(6582), 520–522.

Torralba, A., & Efros, A. A. (2011). Unbiased look at dataset bias. In CVPR.

Torralba, A., Fergus, R., & Freeman, W. (2008). 80 million tiny images: A large data set for nonparametric object and scene recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 30, 1958–1970.

Uijlings, J., van de Sande, K., Gevers, T., & Smeulders, A. (2013). Selective search for object recognition.
<i>International Journal of Computer Vision</i>, <i>104</i>, 154–171.</p><p class="c-article-references__links u-hide-print"><a data-track="click_references" rel="noopener" data-track-label="10.1007/s11263-013-0620-5" data-track-item_id="10.1007/s11263-013-0620-5" data-track-value="article reference" data-track-action="article reference" href="https://link.springer.com/doi/10.1007/s11263-013-0620-5" aria-label="Article reference 82" data-doi="10.1007/s11263-013-0620-5">Article</a>  <a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 82" href="http://scholar.google.com/scholar_lookup?&amp;title=Selective%20search%20for%20object%20recognition&amp;journal=International%20Journal%20of%20Computer%20Vision&amp;doi=10.1007%2Fs11263-013-0620-5&amp;volume=104&amp;pages=154-171&amp;publication_year=2013&amp;author=Uijlings%2CJ&amp;author=Sande%2CK&amp;author=Gevers%2CT&amp;author=Smeulders%2CA"> Google Scholar</a>  </p></li><li class="c-article-references__item js-c-reading-companion-references-item"><p class="c-article-references__text" id="ref-CR83">Urtasun, R., Fergus, R., Hoiem, D., Torralba, A., Geiger, A., Lenz, P., Silberman, N., Xiao, J., &amp; Fidler, S. (2013–2014). Reconstruction meets recognition challenge. <a href="http://ttic.uchicago.edu/rurtasun/rmrc/" data-track="click_references" data-track-action="external reference" data-track-value="external reference" data-track-label="http://ttic.uchicago.edu/rurtasun/rmrc/">http://ttic.uchicago.edu/rurtasun/rmrc/</a>.</p></li><li class="c-article-references__item js-c-reading-companion-references-item"><p class="c-article-references__text" id="ref-CR84">van de Sande, K. E. A., Snoek, C. G. M., &amp; Smeulders, A. W. M. (2014). Fisher and vlad with flair. In <i>Proceedings of the IEEE conference on computer vision and pattern recognition</i>.</p></li><li class="c-article-references__item js-c-reading-companion-references-item"><p class="c-article-references__text" id="ref-CR85">van de Sande, K. E. A., Uijlings, J. R. R., Gevers, T., &amp; Smeulders, A. W. M. (2011b). Segmentation as selective search for object recognition. In <i>ICCV</i>.</p></li><li class="c-article-references__item js-c-reading-companion-references-item"><p class="c-article-references__text" id="ref-CR86">van de Sande, K. E. A., Gevers, T., &amp; Snoek, C. G. M. (2010). Evaluating color descriptors for object and scene recognition. 
<i>IEEE Transactions on Pattern Analysis and Machine Intelligence</i>, <i>32</i>(9), 1582–1596.</p><p class="c-article-references__links u-hide-print"><a data-track="click_references" rel="nofollow noopener" data-track-label="10.1109/TPAMI.2009.154" data-track-item_id="10.1109/TPAMI.2009.154" data-track-value="article reference" data-track-action="article reference" href="https://doi.org/10.1109%2FTPAMI.2009.154" aria-label="Article reference 86" data-doi="10.1109/TPAMI.2009.154">Article</a>  <a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 86" href="http://scholar.google.com/scholar_lookup?&amp;title=Evaluating%20color%20descriptors%20for%20object%20and%20scene%20recognition&amp;journal=IEEE%20Transactions%20on%20Pattern%20Analysis%20and%20Machine%20Intelligence&amp;doi=10.1109%2FTPAMI.2009.154&amp;volume=32&amp;issue=9&amp;pages=1582-1596&amp;publication_year=2010&amp;author=Sande%2CKEA&amp;author=Gevers%2CT&amp;author=Snoek%2CCGM"> Google Scholar</a>  </p></li><li class="c-article-references__item js-c-reading-companion-references-item"><p class="c-article-references__text" id="ref-CR87">van de Sande, K. E. A., Gevers, T., &amp; Snoek, C. G. M. (2011a). Empowering visual categorization with the GPU. <i>IEEE Transactions on Multimedia</i>, <i>13</i>(1), 60–70.</p><p class="c-article-references__links u-hide-print"><a data-track="click_references" rel="nofollow noopener" data-track-label="10.1109/TMM.2010.2091400" data-track-item_id="10.1109/TMM.2010.2091400" data-track-value="article reference" data-track-action="article reference" href="https://doi.org/10.1109%2FTMM.2010.2091400" aria-label="Article reference 87" data-doi="10.1109/TMM.2010.2091400">Article</a>  <a data-track="click_references" data-track-action="google scholar reference" data-track-value="google scholar reference" data-track-label="link" data-track-item_id="link" rel="nofollow noopener" aria-label="Google Scholar reference 87" href="http://scholar.google.com/scholar_lookup?&amp;title=Empowering%20visual%20categorization%20with%20the%20GPU&amp;journal=IEEE%20Transactions%20on%20Multimedia&amp;doi=10.1109%2FTMM.2010.2091400&amp;volume=13&amp;issue=1&amp;pages=60-70&amp;publication_year=2011&amp;author=Sande%2CKEA&amp;author=Gevers%2CT&amp;author=Snoek%2CCGM"> Google Scholar</a>  </p></li><li class="c-article-references__item js-c-reading-companion-references-item"><p class="c-article-references__text" id="ref-CR88">Vittayakorn, S., &amp; Hays, J. (2011). Quality assessment for crowdsourced object annotations. In <i>BMVC</i>.</p></li><li class="c-article-references__item js-c-reading-companion-references-item"><p class="c-article-references__text" id="ref-CR89">von Ahn, L., &amp; Dabbish, L. (2005). Esp: Labeling images with a computer game. In <i>AAAI spring symposium: Knowledge collection from volunteer contributors</i>.</p></li><li class="c-article-references__item js-c-reading-companion-references-item"><p class="c-article-references__text" id="ref-CR90">Vondrick, C., Patterson, D., &amp; Ramanan, D. (2012). Efficiently scaling up crowdsourced video annotation. 
International Journal of Computer Vision, 101, 184–204.

Wan, L., Zeiler, M., Zhang, S., LeCun, Y., & Fergus, R. (2013). Regularization of neural networks using DropConnect. In Proceedings of the international conference on machine learning (ICML'13).

Wang, M., Xiao, T., Li, J., Hong, C., Zhang, J., & Zhang, Z. (2014). Minerva: A scalable and highly efficient training platform for deep learning. In APSys.

Wang, J., Yang, J., Yu, K., Lv, F., Huang, T., & Gong, Y. (2010). Locality-constrained linear coding for image classification. In CVPR.

Wang, X., Yang, M., Zhu, S., & Lin, Y. (2013). Regionlets for generic object detection. In ICCV.

Welinder, P., Branson, S., Belongie, S., & Perona, P. (2010). The multidimensional wisdom of crowds. In NIPS.

Xiao, J., Hays, J., Ehinger, K., Oliva, A., & Torralba, A. (2010). SUN database: Large-scale scene recognition from Abbey to Zoo. In CVPR.

Yang, J., Yu, K., Gong, Y., & Huang, T. (2009). Linear spatial pyramid matching using sparse coding for image classification. In CVPR.

Yao, B., Yang, X., & Zhu, S.-C. (2007). Introduction to a large scale general purpose ground truth dataset: methodology, annotation tool, and benchmarks.
Berlin: Springer.

Zeiler, M. D., & Fergus, R. (2013). Visualizing and understanding convolutional networks. CoRR, abs/1311.2901.

Zeiler, M. D., Taylor, G. W., & Fergus, R. (2011). Adaptive deconvolutional networks for mid and high level feature learning. In ICCV.

Zhou, B., Lapedriza, A., Xiao, J., Torralba, A., & Oliva, A. (2014). Learning deep features for scene recognition using places database. In NIPS.

Zhou, X., Yu, K., Zhang, T., & Huang, T. (2010). Image classification using super-vector coding of local image descriptors. In ECCV.

Acknowledgments

We thank Stanford University, UNC Chapel Hill, Google and Facebook for sponsoring the challenges, and NVIDIA for providing computational resources to participants of ILSVRC2014. We thank our advisors over the years: Lubomir Bourdev, Alexei Efros, Derek Hoiem, Jitendra Malik, Chuck Rosenberg and Andrew Zisserman. We thank the PASCAL VOC organizers for partnering with us in running ILSVRC2010-2012. We thank all members of the Stanford vision lab for supporting the challenges and putting up with us along the way.
Finally, and most importantly, we thank all researchers that have made the ILSVRC effort a success by competing in the challenges and by using the datasets to advance computer vision.

Author information

Authors and Affiliations

Stanford University, Stanford, CA, USA: Olga Russakovsky, Hao Su, Jonathan Krause, Sanjeev Satheesh, Sean Ma, Zhiheng Huang, Andrej Karpathy, Michael Bernstein & Li Fei-Fei
University of Michigan, Ann Arbor, MI, USA: Jia Deng
Massachusetts Institute of Technology, Cambridge, MA, USA: Aditya Khosla
UNC Chapel Hill, Chapel Hill, NC, USA: Alexander C. Berg

Corresponding author: Correspondence to Olga Russakovsky (olga@cs.stanford.edu).
Additional information

Communicated by M. Hebert.

Olga Russakovsky and Jia Deng contributed equally.

Appendices

Appendix 1: ILSVRC2012-2014 Image Classification and Single-Object Localization Object Categories

[Figure: full list of the 1000 ILSVRC2012-2014 image classification and single-object localization object categories.]

Appendix 2: Additional Single-Object Localization Dataset Statistics

We consider two additional metrics of object localization difficulty: chance performance of localization and the level of clutter. We use these metrics to compare the ILSVRC2012-2014 single-object localization dataset to the PASCAL VOC 2012 object detection benchmark. The measures of localization difficulty are computed on the validation set of both datasets. According to both of these measures of difficulty there is a subset of ILSVRC which is as challenging as PASCAL but more than an order of magnitude greater in size. Figure 16 shows the distributions of different properties (object scale, chance performance of localization and level of clutter) across the different classes in the two datasets.
[Fig. 16: Distribution of various measures of localization difficulty on the ILSVRC2012-2014 single-object localization (dark green) and PASCAL VOC 2012 (light blue) validation sets. Object scale is the fraction of image area occupied by an average object instance. Chance performance of localization and level of clutter are defined in Appendix 2. The plots on top contain the full ILSVRC validation set with 1000 classes; the plots on the bottom contain the 200 ILSVRC classes with the lowest chance performance of localization. All plots contain all 20 classes of PASCAL VOC.]

Chance Performance of Localization (CPL). Chance performance on a dataset is a common metric to consider. We define the CPL measure as the expected accuracy of a detector which first randomly samples an object instance of that class and then uses its bounding box directly as the proposed localization window on all other images (after rescaling the images to the same size). Concretely, let $B_1, B_2, \dots, B_N$ be all the bounding boxes of the object instances within a class; then

$$\text{CPL} = \frac{\sum_i \sum_{j \ne i} \mathbb{1}\left[\text{IoU}(B_i, B_j) \ge 0.5\right]}{N(N-1)} \qquad (6)$$

Some of the most difficult ILSVRC categories to localize according to this metric are basketball, swimming trunks, ping pong ball and rubber eraser, all with less than 0.2% CPL. This measure correlates strongly ($\rho = 0.9$) with the average scale of the object (fraction of image area occupied by the object). The average CPL across the 1000 ILSVRC categories is 20.8%. The 20 PASCAL categories have an average CPL of 8.7%, which is the same as the CPL of the 562 most difficult categories of ILSVRC.
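As a concrete illustration, the following is a minimal sketch of how Eq. (6) could be computed for one category, assuming boxes are given as (x1, y1, x2, y2) tuples in the coordinates of images rescaled to a common size. The helper names are ours, not part of any released toolkit.

```python
from itertools import permutations

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union

def cpl(boxes):
    """Eq. (6): fraction of ordered box pairs (i, j), i != j, in which
    box i, used directly as the predicted window, would localize box j
    with IoU >= 0.5."""
    n = len(boxes)
    if n < 2:  # CPL is undefined for a single instance
        return 0.0
    hits = sum(iou(bi, bj) >= 0.5 for bi, bj in permutations(boxes, 2))
    return hits / float(n * (n - 1))
```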
Clutter. Intuitively, even small objects are easy to localize on a plain background. To quantify clutter we employ the objectness measure of Alexe et al. (2012), which is a class-generic object detector evaluating how likely a window in the image is to contain a coherent object (of any class) as opposed to background (sky, water, grass). For every image $m$ containing target object instances at positions $B_1^m, B_2^m, \dots$, we use the publicly available objectness software to sample 1000 windows $W_1^m, W_2^m, \dots, W_{1000}^m$, in order of decreasing probability of the window containing any generic object. Let obj(m) be the number of generic object-looking windows sampled before localizing an instance of the target category, i.e., $\text{obj}(m) = \min\{k : \max_i \text{IoU}(W_k^m, B_i^m) \ge 0.5\}$. For a category containing $M$ images, we compute the average number of such windows per image and define

$$\text{Clutter} = \log_2\left(\frac{1}{M} \sum_m \text{obj}(m)\right) \qquad (7)$$

The higher the clutter of a category, the harder the objects are to localize according to generic cues. If an object cannot be localized with the first 1000 windows (as is the case for 1% of images on average per category in ILSVRC and 5% in PASCAL), we set obj(m) = 1001. The fact that more than 95% of objects can be localized with these windows implies that the objectness cue is already quite strong, so objects that require many windows on average will be extremely difficult to detect: e.g., ping pong ball (clutter of 9.57, or 758 windows on average), basketball (clutter of 9.21), puck (clutter of 9.17) in ILSVRC. The most difficult object in PASCAL is bottle, with a clutter score of 8.47. On average, ILSVRC has a clutter score of 3.59. The most difficult subset of ILSVRC with 250 object categories has an order of magnitude more categories and the same average amount of clutter (5.90) as the PASCAL dataset.
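A sketch of Eq. (7) under the same box conventions as above, reusing the iou helper from the CPL sketch; the objectness-ranked windows are assumed to be precomputed per image by the objectness software.

```python
import math

def obj_count(windows, instances, cap=1001):
    """obj(m): 1-based index of the first objectness-ranked window that
    overlaps some target instance with IoU >= 0.5, or `cap` when none
    of the 1000 windows succeeds (the capped case in the text)."""
    for k, w in enumerate(windows, start=1):
        if any(iou(w, b) >= 0.5 for b in instances):
            return k
    return cap

def clutter(per_image):
    """Eq. (7): log2 of the mean obj(m) over a category's M images.
    `per_image` is a list of (windows, instances) pairs, one per image."""
    counts = [obj_count(ws, inst) for ws, inst in per_image]
    return math.log2(sum(counts) / float(len(counts)))
```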
<a data-track="click" data-track-label="link" data-track-action="section anchor" href="/article/10.1007/s11263-015-0816-y#Sec19">3.3.2</a> we discussed three types of queries we used for collecting the object detection images: (1) single object category name or a synonym; (2) a pair of object category names; (3) a manual query, typically targetting one or more object categories with insufficient data. Here we provide a list of the 129 manually curated queries:</p><div class="c-article-section__figure c-article-section__figure--no-border" data-test="figure" data-container-section="figure" id="figure-d"><figure><div class="c-article-section__figure-content" id="Figd"><div class="c-article-section__figure-item"><div class="c-article-section__figure-content"><picture><source type="image/webp" srcset="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Figd_HTML.gif?as=webp"><img aria-describedby="Figd" src="//media.springernature.com/lw685/springer-static/image/art%3A10.1007%2Fs11263-015-0816-y/MediaObjects/11263_2015_816_Figd_HTML.gif" alt="figure d" loading="lazy"></picture></div></div><div class="c-article-section__figure-description" data-test="bottom-caption" id="figure-d-desc"></div></div></figure></div> <h3 class="c-article__sub-heading" id="App4">Appendix 4: Hierarchy of Questions for Full Image Annotation</h3><p>The following is a hierarchy of questions manually constructed for crowdsourcing full annotation of images with the presence or absence of 200 object detection categories in ILSVRC2013 and ILSVRC2014. All questions are of the form “is there a ... in the image?” Questions marked with <span class="mathjax-tex">\(\bullet \)</span> are asked on every image. If the answer to a question is determined to be “no” then the answer to all descendant questions is assumed to be “no”. The 200 numbered leaf nodes correspond to the 200 object detection categories.</p><p>The goal in the hierarchy construction is to save cost (by asking as few questions as possible on every image) while avoiding any ambiguity in questions which would lead to false negatives during annotation. 
[Figure: hierarchy of questions for full image annotation, part 1.]

[Figure: hierarchy of questions for full image annotation, part 2.]

Appendix 5: Modification to Bounding Box System for Object Detection

The bounding box annotation system described in Sect. 3.2.1 is used for annotating images for both the single-object localization dataset and the object detection dataset. However, two additional manual post-processing steps are needed to ensure accuracy in the object detection scenario:

Ambiguous Objects. The first common source of error was that workers were not able to accurately differentiate some object classes during annotation. Some commonly confused labels were seal and sea otter, backpack and purse, banjo and guitar, violin and cello, brass instruments (trumpet, trombone, french horn and brass), flute and oboe, and ladle and spatula. Despite our best efforts (providing positive and negative example images in the annotation task, adding text explanations to alert the user to the distinction between these categories) these errors persisted.

In the single-object localization setting, this problem was not as prominent for two reasons. First, the way the data was collected imposed a strong prior on the object class which was present. Second, since only one object category needed to be annotated per image, ambiguous images could be discarded: for example, if workers couldn't agree on whether or not a trumpet was in fact present, this image could simply be removed.
In contrast, for the object detection setting consensus had to be reached for all target categories on all images.

To fix this problem, once bounding box annotations were collected we manually looked through all cases where the bounding boxes for two different object classes had significant overlap with each other (about 3% of the collected boxes). About a quarter of these boxes were found to correspond to incorrect objects and were removed. Crowdsourcing this post-processing step (with very stringent accuracy constraints) would be possible, but it occurred in few enough cases that it was faster (and more accurate) to do this in-house.

Duplicate Annotations. The second common source of error was duplicate bounding boxes drawn on the same object instance. Despite instructions not to draw more than one bounding box around the same object instance, and constraints in the annotation UI enforcing at least a 5 pixel difference between different bounding boxes, these errors persisted. One reason was that sometimes the initial bounding box was not perfect and subsequent labelers drew a slightly improved alternative.

This type of error was also present in the single-object localization scenario but was not a major cause for concern. A duplicate bounding box is a slightly perturbed but still correct positive example, and single-object localization is only concerned with correctly localizing one object instance. For the detection task algorithms are evaluated on the ability to localize every object instance, and penalized for duplicate detections, so it is imperative that these labeling errors are corrected (even if they only appear in about 0.6% of cases).

Approximately 1% of bounding boxes were found to have significant overlap of more than 50% with another bounding box of the same object class. We again manually verified all of these cases in-house. In approximately 40% of the cases the two bounding boxes correctly corresponded to different people in a crowd, to stacked plates, or to musical instruments nearby in an orchestra. In the other 60% of cases one of the boxes was randomly removed.

These verification steps complete the annotation procedure of bounding boxes around every instance of every object class in validation, test and a subset of training images for the detection task.
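A minimal sketch of the screening step just described, again reusing the iou helper from the CPL sketch: same-class pairs overlapping by more than 50% are only flagged for in-house review, never removed automatically, since a flagged pair may be two genuinely distinct instances.

```python
from itertools import combinations

def flag_duplicate_candidates(boxes_by_class, thresh=0.5):
    """Flag same-class box pairs whose IoU exceeds `thresh` for manual
    verification: each pair is either two distinct nearby instances
    (people in a crowd, stacked plates) or a true duplicate to remove."""
    flagged = []
    for cls, boxes in boxes_by_class.items():
        for (i, a), (j, b) in combinations(enumerate(boxes), 2):
            if iou(a, b) > thresh:
                flagged.append((cls, i, j))
    return flagged
```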
Training Set Annotation. With the optimized algorithm of Sect. 3.3.3 we fully annotated the validation and test sets. However, annotating all training images with all target object classes was still a budget challenge. Positive training images taken from the single-object localization dataset already had bounding box annotations of all instances of one object class on each image. We extended the existing annotations to the detection dataset by making two modifications. First, we corrected any bounding box omissions resulting from merging fine-grained categories: i.e., if an image belonged to the "dalmatian" category and all instances of "dalmatian" were annotated with bounding boxes for single-object localization, we ensured that all remaining "dog" instances were also annotated for the object detection task. Second, we collected significantly more training data for the person class, because the existing annotation set was not diverse enough to be representative (the only people categories in the single-object localization task are scuba diver, groom, and ballplayer). To compensate, we additionally annotated people in a large fraction of the existing training set images.

Appendix 6: Competition Protocol

Competition Format. At the beginning of the competition period each year we release the new training/validation/test images, training/validation annotations, and the competition specification for the year. We then specify a deadline for submission, usually approximately 4 months after the release of data. Teams are asked to upload a text file of their predicted annotations on test images by this deadline to a provided server. We then evaluate all submissions and release the results.

For every task we released code that takes a text file of automatically generated image annotations and compares it with the ground truth annotations to return a quantitative measure of algorithm accuracy. Teams can use this code to evaluate their performance on the validation data (see the sketch below).

As described in Everingham et al. (2014), there are three options for measuring performance on test data: (i) release test images and annotations, and allow participants to assess performance themselves; (ii) release test images but not test annotations: participants submit results and organizers assess performance; (iii) release neither test images nor annotations: participants submit software and organizers run it on new data and assess performance. In line with the PASCAL VOC choice, we opted for option (ii). Option (i) allows too much leeway in overfitting to the test data; option (iii) is infeasible, especially given the scale of our test set (40K–100K images).

We released ILSVRC2010 test annotations for the image classification task, but all other test annotations have remained hidden to discourage fine-tuning results on the test data.
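For the classification task, the sketch below illustrates the kind of flat-file evaluation described in this appendix: each submission line carries up to five predicted labels for one test image, and an image counts as correct if any of them matches the ground truth. The file format details here are illustrative assumptions, not the official evaluation toolkit.

```python
def top5_error(pred_file, truth_file):
    """Fraction of test images whose ground-truth label is absent from
    the (up to five) predicted labels on the matching submission line.
    Assumes one image per line, whitespace-separated labels, and the
    same image order in both files."""
    with open(truth_file) as f:
        truth = [line.split()[0] for line in f]
    errors = 0
    with open(pred_file) as f:
        for gt, line in zip(truth, f):
            if gt not in line.split()[:5]:
                errors += 1
    return errors / float(len(truth))
```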
As described in Everingham et al. (2014), there are three options for measuring performance on test data: (i) release test images and annotations, and allow participants to assess performance themselves; (ii) release test images but not test annotations, with participants submitting results and organizers assessing performance; (iii) release neither test images nor annotations, with participants submitting software that organizers run on new data to assess performance. In line with the PASCAL VOC choice, we opted for option (ii). Option (i) allows too much leeway for overfitting to the test data; option (iii) is infeasible, especially given the scale of our test set (40K–100K images).

We released the ILSVRC2010 test annotations for the image classification task, but all other test annotations have remained hidden to discourage fine-tuning of results on the test data.

Evaluation Protocol After the Challenge. After the challenge period we set up an automatic evaluation server that researchers can use throughout the year to continue evaluating their algorithms against the ground truth test annotations. We limit teams to 2 submissions per week to discourage parameter tuning on the test data; in practice, we have never had a problem with researchers abusing the system.
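A rolling-window quota of this kind takes only a few lines of server-side bookkeeping. The sketch below is hypothetical (the paper does not describe the evaluation server's implementation); it illustrates one way to enforce "at most 2 submissions per team per week".

```python
from collections import defaultdict
from datetime import datetime, timedelta, timezone

class SubmissionLimiter:
    """Allow each team at most `limit` evaluations per rolling window.
    A hypothetical sketch, not the actual ILSVRC server code."""

    def __init__(self, limit=2, window=timedelta(weeks=1)):
        self.limit = limit
        self.window = window
        self.history = defaultdict(list)  # team id -> submission timestamps

    def try_submit(self, team_id, now=None):
        """Record and accept a submission, or reject it if the team has
        already used its quota within the current rolling window."""
        now = now or datetime.now(timezone.utc)
        recent = [t for t in self.history[team_id] if now - t < self.window]
        self.history[team_id] = recent
        if len(recent) >= self.limit:
            return False  # over quota until the window rolls forward
        self.history[team_id].append(now)
        return True
```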