Abstract

Although the performance of person Re-Identification (ReID) has been significantly boosted, many challenging issues in real scenarios have not been fully investigated, e.g., complex scenes and lighting variations, viewpoint and pose changes, and the large number of identities in a camera network. To facilitate research towards conquering those issues, this paper contributes a new dataset called MSMT17 with many important features, e.g., 1) the raw videos are taken by a 15-camera network deployed in both indoor and outdoor scenes, 2) the videos cover a long period of time and present complex lighting variations, and 3) it contains currently the largest number of annotated identities, i.e., 4,101 identities and 126,441 bounding boxes. We also observe that a domain gap commonly exists between datasets, which causes a severe performance drop when training and testing on different datasets. As a result, available training data cannot be effectively leveraged for new testing domains. To relieve the expensive cost of annotating new training samples, we propose a Person Transfer Generative Adversarial Network (PTGAN) to bridge the domain gap. Comprehensive experiments show that the domain gap can be substantially narrowed by PTGAN.
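The abstract only names PTGAN; in the paper, the transfer objective combines a style-transfer term with a loss that keeps the person's appearance (identity) unchanged on the foreground during transfer. The sketch below illustrates how such a combined objective might look. The λ weight, function names, and the NumPy formulation are illustrative assumptions for exposition, not the paper's implementation.

```python
import numpy as np

def identity_loss(src_img, transferred_img, fg_mask):
    # Mean squared difference on the person foreground, weighted by a
    # binary mask, encouraging the transferred image to keep the same
    # identity while the background/style changes. (Illustrative form.)
    diff = (src_img - transferred_img) ** 2
    return float(np.sum(fg_mask * diff) / (np.sum(fg_mask) + 1e-8))

def ptgan_loss(style_loss, id_loss, lam=10.0):
    # Combined objective: style-transfer term plus a weighted
    # identity-preservation term. `lam` is an assumed hyperparameter.
    return style_loss + lam * id_loss

# Toy example with random "images" and an all-foreground mask:
rng = np.random.default_rng(0)
src = rng.random((4, 4))
out = src.copy()           # a perfect transfer keeps the foreground intact
mask = np.ones((4, 4))
assert identity_loss(src, out, mask) == 0.0
```

The intuition is that the adversarial/style term pushes the image toward the target dataset's appearance, while the masked identity term anchors the pixels that matter for re-identification.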

Keywords

Computer science, Bounding overwatch, Identification (biology), Domain (mathematical analysis), Bridge (graph theory), Generative adversarial network, Transfer of learning, Artificial intelligence, Generative grammar, Cover (algebra), Adversarial system, Machine learning, Transfer (computing), Deep learning, Computer vision, Engineering

Publication Info

Year
2018
Type
preprint
Citations
1899
Access
Closed

Citation Metrics

OpenAlex: 1899
Influential: 407
CrossRef: 1486

Cite This

Longhui Wei, Shiliang Zhang, Wen Gao et al. (2018). Person Transfer GAN to Bridge Domain Gap for Person Re-identification. 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition. https://doi.org/10.1109/cvpr.2018.00016

Identifiers

DOI
10.1109/cvpr.2018.00016
arXiv
1711.08565

Data Quality

Data completeness: 84%