A Survey of Unsupervised Deep Domain Adaptation

Preprint · December 2018. All content following this page was uploaded by Diane J. Cook on 17 August 2019.

GARRETT WILSON and DIANE J. COOK, Washington State University, USA

Deep learning has produced state-of-the-art results for a variety of tasks. While such approaches for supervised learning have performed well, they assume that training and testing data are drawn from the same distribution, which may not always be the case. To address this challenge, unsupervised domain adaptation can handle situations where a network is trained on labeled data from a source domain and unlabeled data from a related but different target domain, with the goal of performing well at test time on the target domain. Many unsupervised deep domain adaptation approaches have thus been developed. This survey will compare these approaches by examining alternative methods, the unique and common elements, results, and theoretical insights. We follow this with a look at application areas and open research directions.
CCS Concepts: • General and reference → Surveys and overviews; • Computing methodologies → Transfer learning; Unsupervised learning; Neural networks; Adversarial learning

Additional Key Words and Phrases: domain adaptation, deep learning, generative adversarial networks

ACM Reference Format:
Garrett Wilson and Diane J. Cook. 2019. A Survey of Unsupervised Deep Domain Adaptation. ACM Trans. Intell. Syst. Technol. 1, 1, Article 1 (March 2019), 43 pages. https://doi.org/***.***

1 INTRODUCTION

Supervised learning is arguably the most prevalent type of machine learning and has enjoyed much success across diverse application areas. However, many supervised learning methods make a common assumption: the training and testing data are drawn from the same distribution. When this assumption is violated, a classifier trained on the source domain will likely experience a drop in performance when tested on the target domain due to the differences between domains [160]. Domain adaptation refers to the goal of learning a concept from labeled data in a source domain that performs well on a different but related target domain [62, 69, 158]. Unsupervised domain adaptation specifically addresses the situation where there is labeled source data and only unlabeled target data available for use during training [62, 128].

Because of its ability to adapt labeled data for use in a new application, domain adaptation can reduce the need for costly labeled data in the target domain. As an example, consider the problem of semantically segmenting images. Each real image in the Cityscapes dataset required approximately 1.5 hours to annotate for semantic segmentation [39].
In this case, human annotation time could be spared by training an image semantic segmentation model on synthetic street view images (the source domain), since these can be cheaply generated, then adapting and testing for real street view images (the target domain, here the Cityscapes dataset).

Authors' address: Garrett Wilson, garrett.wilson@wsu.edu; Diane J. Cook, djcook@wsu.edu, Washington State University, School of Electrical Engineering and Computer Science, Pullman, WA, 99164, USA.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.
© 2019 Copyright held by the owner/author(s). Publication rights licensed to ACM.
2157-6904/2019/3-ART1 $15.00
https://doi.org/***.***
ACM Trans. Intell. Syst. Technol., Vol. 1, No. 1, Article 1. Publication date: March 2019.

An undeniable trend in machine learning is the increased usage of deep neural networks. Deep networks have produced many state-of-the-art results for a variety of machine learning tasks [62, 69] such as image classification, speech recognition, machine translation, and image generation [68, 69]. When trained on large amounts of data, these many-layer neural networks can learn powerful, hierarchical representations [69, 128, 200] and can be highly scalable [65]. At the same time, these networks can also experience performance drops due to domain shifts [61, 200].
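The performance drop caused by domain shift can be illustrated with a small, self-contained toy example (synthetic data and code invented for this preview, not taken from the survey): a linear classifier fit on a "source" distribution is then evaluated on a shifted "target" distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_domain(shift, n=1000):
    # Two Gaussian classes separated along x; `shift` translates the whole domain.
    x0 = rng.normal([-1.0 + shift, 0.0], 0.5, size=(n // 2, 2))
    x1 = rng.normal([+1.0 + shift, 0.0], 0.5, size=(n // 2, 2))
    return np.vstack([x0, x1]), np.array([0] * (n // 2) + [1] * (n // 2))

Xs, ys = make_domain(shift=0.0)   # source domain: labels available
Xt, yt = make_domain(shift=1.5)   # target domain: labels used only to measure accuracy

# Logistic regression trained by gradient descent on the source domain only.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))
    w -= 0.1 * Xs.T @ (p - ys) / len(ys)
    b -= 0.1 * (p - ys).mean()

def accuracy(X, y):
    return float((((X @ w + b) > 0).astype(int) == y).mean())

print(f"source accuracy: {accuracy(Xs, ys):.2f}")   # high
print(f"target accuracy: {accuracy(Xt, yt):.2f}")   # much lower under the shift
```

Source accuracy is near perfect while target accuracy drops sharply; this gap is exactly what unsupervised domain adaptation methods aim to close without target labels.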
Thus, much research has gone into adapting such networks from large labeled datasets to domains where little (or possibly no) labeled training data is available (for a list, see [225]). These unsupervised deep domain adaptation approaches, which combine the benefit of deep learning with the very practical use of domain adaptation to remove the reliance on potentially costly target data labels, will be the focus of this survey.

A number of surveys have been created on the topic of domain adaptation [10, 21, 36, 41, 42, 107, 108, 141, 160, 201, 215, 249] and more generally transfer learning [38, 113, 133, 158, 191, 205, 207, 221, 237], of which domain adaptation can be viewed as a special case [160]. Previous domain adaptation surveys lack depth of coverage and comparison of unsupervised deep domain adaptation approaches. In some cases, prior surveys do not discuss domain mapping [41, 42, 107], normalization statistic-based [41, 42, 107, 249], or ensemble-based [41, 42, 107, 215, 249] methods. In other cases, they do not survey deep learning approaches [10, 108, 141, 160]. Still others are application-centric, focusing on a single use case such as machine translation [21, 36]. One earlier survey focuses on the multi-source scenario [201], while we focus on the more prevalent single-source scenario. Transfer learning is a broader topic to cover, thus these surveys provide minimal coverage and comparison of the deep learning methods that have been designed for unsupervised domain adaptation [133, 158, 191, 205, 221, 237], or they focus on tasks such as activity recognition [38] or reinforcement learning [113, 207]. The goal of this survey is to discuss, highlight unique components of, and compare approaches to unsupervised deep domain adaptation.

We first provide background on where domain adaptation fits into the more general problem of transfer learning.
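Of the method families named above, normalization statistic-based adaptation is simple enough to sketch in a few lines. The following toy example (synthetic data; a crude stand-in written for this preview in the spirit of approaches that recompute normalization statistics on the unlabeled target domain, not code from the survey) trains a classifier on standardized source features and then standardizes the target domain with its own label-free statistics at test time.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_domain(shift, n=1000):
    # Two Gaussian classes separated along x; `shift` translates the whole domain.
    x0 = rng.normal([-1.0 + shift, 0.0], 0.5, size=(n // 2, 2))
    x1 = rng.normal([+1.0 + shift, 0.0], 0.5, size=(n // 2, 2))
    return np.vstack([x0, x1]), np.array([0] * (n // 2) + [1] * (n // 2))

Xs, ys = make_domain(0.0)    # labeled source domain
Xt, yt = make_domain(1.5)    # target domain; yt is used only for evaluation

def standardize(X):
    # Uses only unlabeled statistics, so it is legal on the target domain.
    return (X - X.mean(axis=0)) / X.std(axis=0)

# Train logistic regression on standardized source features.
Zs = standardize(Xs)
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(Zs @ w + b)))
    w -= 0.1 * Zs.T @ (p - ys) / len(ys)
    b -= 0.1 * (p - ys).mean()

def acc(Z, y):
    return float((((Z @ w + b) > 0).astype(int) == y).mean())

no_adapt = acc((Xt - Xs.mean(axis=0)) / Xs.std(axis=0), yt)  # source statistics
adapted = acc(standardize(Xt), yt)                           # target's own statistics

print(f"target accuracy without adaptation: {no_adapt:.2f}")
print(f"target accuracy with adaptation:    {adapted:.2f}")
```

Because the shift here is a simple translation, re-estimating first and second moments on the target is enough to recover most of the lost accuracy; the deep methods surveyed later apply the same idea to learned intermediate representations.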
We follow this with an overview of generative adversarial networks (GANs) to provide background for the increasingly widespread use of adversarial techniques in domain adaptation. Next, we investigate the various domain adaptation methods, the components of those methods, and the results. Then, we overview domain adaptation theory and discuss what we can learn from the theoretical results. Finally, we look at application areas and open research directions.

[Content truncated by the preview: only the beginning and end of the document are shown.]

[...] Kumar, and Jinsong Wang. 2018. Unsupervised Domain Adaptation for Semantic Segmentation via Class-Balanced Self-Training. In The European Conference on Computer Vision (ECCV).

Received March 2019
