Active Sampling Based on MMD for Model Adaptation

Document Type

Conference Proceeding

Source of Publication

Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST

Publication Date

1-1-2019

Abstract

© 2019, ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering. In this paper, we present a method for transfer learning that requires minimal supervised information. Recently, researchers have proposed various algorithms for transfer learning problems, especially unsupervised domain adaptation. These methods mainly focus on learning a good common representation and using it directly for the downstream task. Unfortunately, they ignore the fact that such a representation may not capture target-specific features well. To address this problem, this paper attempts to capture target-specific features by utilizing labeled data in the target domain. The challenge is then how to use as little supervised information as possible while still achieving good results. To overcome this challenge, we actively select instances for training and model adaptation based on the maximum mean discrepancy (MMD). In this process, we label a small amount of valuable target data to capture target-specific features and fine-tune the classifier network. Specifically, we choose a batch of target-domain instances that lie far from the common representation space and have maximum predictive entropy. The first criterion helps learn a good representation for the target domain, and the second improves classifier performance. Finally, experiments on several datasets show significant improvement and a competitive advantage over common methods.
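The selection rule described in the abstract (pick target instances far from the common representation space and with high predictive entropy) can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function names (select_batch, gaussian_kernel) and parameters (alpha, gamma, batch_size) are illustrative assumptions, and the "distance from the common representation" is approximated here by the distance of each target point's kernel embedding from the mean embedding of the source/shared features, an MMD-style witness.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    """RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def select_batch(target_feats, target_probs, source_feats,
                 batch_size=32, alpha=0.5, gamma=1.0):
    """Illustrative active-sampling criterion (not the authors' exact code):
    rank unlabeled target instances by (i) distance of their kernel embedding
    from the mean embedding of the shared/source representation and
    (ii) predictive entropy, then return the top-scoring batch for labeling."""
    # Distance to the kernel mean embedding of the source features:
    # ||phi(x) - mu_S||^2 = k(x, x) - (2/n) * sum_i k(x, s_i) + const.
    K_ts = gaussian_kernel(target_feats, source_feats, gamma)
    dist = 1.0 - 2.0 * K_ts.mean(axis=1)  # k(x, x) = 1 for the RBF kernel

    # Predictive entropy of the current classifier on each target instance.
    p = np.clip(target_probs, 1e-12, 1.0)
    entropy = -(p * np.log(p)).sum(axis=1)

    # Normalize both criteria to [0, 1] and combine with weight alpha.
    norm = lambda v: (v - v.min()) / (v.max() - v.min() + 1e-12)
    score = alpha * norm(dist) + (1.0 - alpha) * norm(entropy)
    return np.argsort(-score)[:batch_size]
```

The selected indices would then be sent for labeling and used to fine-tune the classifier, per the procedure the abstract outlines.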

ISBN

9783030323875

ISSN

1867-8211

Publisher

Springer

Volume

294 LNICST

First Page

397

Last Page

409

Disciplines

Computer Sciences

Keywords

Active sampling, Characteristics, Maximum mean discrepancy, Transfer learning, Uncertainty

Scopus ID

85076135448

Indexed in Scopus

yes

Open Access

no
