Fashion Compatibility Modeling through a Multi-modal Try-on-guided Scheme
Xue Dong Jianlong Wu Xuemeng Song Hongjun Dai Liqiang Nie
Accepted to ACM SIGIR 2020
Abstract
Recent years have witnessed a growing trend of fashion compatibility modeling, which scores the matching degree of a given outfit and then offers people dressing advice. Existing methods primarily solve this problem by analyzing the discrete interactions among complementary items. However, fashion items exhibit occlusion and deformation when they are tried on, so discrete item interaction cannot capture fashion compatibility in a combined manner: it neglects a crucial factor, the overall try-on appearance. In light of this, we propose a multi-modal try-on-guided compatibility modeling scheme that jointly characterizes the discrete interaction and the try-on appearance. In particular, we first propose a multi-modal try-on template generator that automatically produces a try-on template of the outfit, depicting the overall look of its composing fashion items. We then introduce a new compatibility modeling scheme that integrates the outfit's try-on appearance into discrete item interaction modeling. To support this, we construct FOTOS, a large-scale real-world dataset collected from SSENSE, consisting of 11,000 well-matched outfits and their corresponding realistic try-on images.
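The scheme's core idea, fusing discrete item-interaction features with an embedding of the generated try-on template to produce a compatibility score, can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: the module name, layer sizes, and mean-pooling of item features are all assumptions made for the example.

```python
import torch
import torch.nn as nn

class TryOnGuidedScorer(nn.Module):
    """Illustrative sketch (hypothetical, not the paper's TryOn-CM):
    combine item-interaction features with a try-on template embedding
    to score outfit compatibility."""
    def __init__(self, item_dim=64, tryon_dim=64, hidden=32):
        super().__init__()
        self.item_mlp = nn.Sequential(nn.Linear(item_dim, hidden), nn.ReLU())
        self.tryon_mlp = nn.Sequential(nn.Linear(tryon_dim, hidden), nn.ReLU())
        self.score = nn.Linear(2 * hidden, 1)

    def forward(self, item_feats, tryon_feat):
        # item_feats: (batch, n_items, item_dim); mean-pool as a simple
        # stand-in for discrete item interaction modeling
        interaction = item_feats.mean(dim=1)
        # tryon_feat: (batch, tryon_dim), e.g. an encoding of the
        # generated try-on template
        fused = torch.cat([self.item_mlp(interaction),
                           self.tryon_mlp(tryon_feat)], dim=-1)
        # sigmoid maps the fused score to a (0, 1) compatibility probability
        return torch.sigmoid(self.score(fused)).squeeze(-1)

scorer = TryOnGuidedScorer()
items = torch.randn(2, 3, 64)   # 2 outfits, 3 items each
tryon = torch.randn(2, 64)      # try-on template embeddings
probs = scorer(items, tryon)    # shape (2,), values in (0, 1)
```

In the actual model, the try-on embedding would come from the multi-modal try-on template generator rather than random features.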
Figure 2: Illustration of the proposed TryOn-CM, which analyzes fashion compatibility from both the discrete item interaction and the try-on appearance.


Framework
Figure 3: Structure of the multi-modal try-on template generator, comprising a visual generator and a textual generator.
FOTOS Dataset

