Enhancing Diverse Intra-Identity Representation for Visible-Infrared Person Re-Identification

ComputerVisionFoundation Videos
19 views · 6 months ago
Authors: Sejun Kim; Soonyong Gwon; Kisung Seo
Description: Visible-Infrared Person Re-Identification (VI-ReID) is a challenging task due to the modality discrepancy. To reduce the modality gap, existing methods primarily focus on sample diversity, such as data augmentation or generating an intermediate modality between the visible and infrared domains. However, these methods do not account for the increase in intra-instance variance caused by sample diversity, and they focus on dominant features, leaving a residual modality gap for hard samples. This limitation hinders further performance improvement. We propose Intra-identity Representation Diversification (IRD) based metric learning to handle the intra-instance variance. Specifically, the IRD method enlarges the Intra-modality Intra-identity Representation Space (IIRS) for each modality within the same identity to learn diverse feature representations. This enables the formation of a shared space capable of representing common features across heterogeneous modalities, thereby reducing the modality gap more effectively. In addition, we introduce a HueGray (HG) data augmentation method, which increases sample diversity simply and effectively. Finally, we propose the Diversity Enhancement Network (DEN) for robustly handling intra-instance variance. The proposed method demonstrates superior performance compared to state-of-the-art methods on the SYSU-MM01 and RegDB datasets. Notably, on the challenging SYSU-MM01 dataset, our approach achieves a Rank-1 accuracy of 76.36% and a mean Average Precision (mAP) of 71.30%.
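The description does not give implementation details of the HueGray (HG) augmentation, but color-channel augmentations of this kind are commonly realized by randomly swapping between a grayscale view and a hue-perturbed view of the RGB input. The sketch below is a hypothetical illustration of that general idea, not the authors' actual method: `hue_gray_augment`, the `p_gray` probability, and the channel-roll "hue" perturbation are all assumptions for demonstration.

```python
import random
import numpy as np

def hue_gray_augment(img, p_gray=0.5, seed=None):
    """Hypothetical HueGray-style augmentation sketch (NOT the paper's exact method).

    With probability p_gray, replace the RGB image with its 3-channel
    grayscale version (reduces color cues, mimicking the infrared modality);
    otherwise apply a crude hue perturbation by cyclically permuting the
    color channels. img: HxWx3 uint8 array.
    """
    rng = random.Random(seed)
    img = img.astype(np.float32)
    if rng.random() < p_gray:
        # Luminance-weighted grayscale, replicated to 3 channels.
        gray = img @ np.array([0.299, 0.587, 0.114], dtype=np.float32)
        out = np.stack([gray] * 3, axis=-1)
    else:
        # Crude hue-like change: rotate the R, G, B channels.
        shift = rng.choice([1, 2])
        out = np.roll(img, shift, axis=-1)
    return np.clip(out, 0, 255).astype(np.uint8)
```

In a VI-ReID training pipeline, such an augmentation would typically be applied only to visible-spectrum images so the model sees color-degraded samples that sit closer to the infrared domain.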
Published 6 months ago, on 1402/11/09 (Solar Hijri calendar).