Personalized clothing matching recommendation based on multi-modal fusion

Cited by: 0
Authors
Liu J. [1 ,2 ,3 ]
Zhang F. [1 ]
Hu X. [1 ,2 ,3 ]
Peng T. [2 ]
Li L. [1 ,2 ,3 ]
Zhu Q. [1 ,2 ,3 ]
Zhang J. [1 ,2 ,3 ]
Affiliations
[1] College of Computer Science and Artificial Intelligence, Wuhan Textile University, Hubei, Wuhan
[2] Hubei Provincial Engineering Research Center for Intelligent Textile and Fashion, Wuhan Textile University, Hubei, Wuhan
[3] Engineering Research Center of Hubei Province for Clothing Information, Wuhan Textile University, Hubei, Wuhan
Source
Journal of Textile Research (Fangzhi Xuebao)
Keywords
complementary clothing matching; feature extraction; feature fusion; matching degree; multi-modal; personalized recommendation;
DOI
10.13475/j.fzxb.20211106611
Abstract
Objective In the context of fast fashion, most consumers lack the keen insight of professional designers into clothing matching, which makes it difficult for them to quickly select an appropriate, harmonious and well-suited outfit from a large number of garments. To improve users' online shopping experience and help them accurately convey their personality, professional identity, status and other aspects of their image, this paper aims to achieve high-precision recommendation by improving the clothing matching degree, so as to meet consumers' strong demand for personalized clothing matching recommendation.

Method By studying the highly nonlinear and complex attribute interactions ranging from clothing color to category, and taking the matching degree of clothing combinations as the quantitative criterion, an embedding model of the latent feature representation space of items was built. A matrix factorization framework integrating multi-modal information was then constructed, the shortcomings of existing multi-modal feature fusion algorithms were analyzed, and the clothing style preferences of different users were characterized. Through feature extraction, multi-modal feature fusion and matching degree calculation, personalized clothing matching schemes were established.

Results PCMF (personalized clothing matching recommendation based on multi-modal fusion) was compared qualitatively with several conventional clothing matching methods. Against all baselines, the clothing matching degree calculated by the model reached an AUC (area under curve) of 0.81, 1.25% higher than that of the conventional methods (Tab. 2), confirming that the transposition fusion of text and visual features used in PCMF strengthens the correlation between features and makes the representation of individual style more accurate. To compare the contribution of different modalities to the matching degree modeled by PCMF, experiments were conducted under three modal combinations, i.e. PCMF-T (using only the text information of items), PCMF-V (using only the visual information of items), and PCMF-TV (using both the visual and text information of items) (Tab. 3). The AUC of PCMF-T reached 0.775, higher than that of PCMF-V (0.763), indicating that the text information of an item more succinctly summarizes its key features, such as patterns, materials and brands. PCMF-TV outperforms both PCMF-T and PCMF-V, which indicates the necessity of combining the multi-modal information of items and verifies the effectiveness of adding user factors to general clothing matching modeling for personalized recommendation. To evaluate its practical applicability, the PCMF model was deployed in a complementary item retrieval task (Fig. 5), demonstrating that it can complete personalized clothing matching recommendation according to the user's preferences. In addition, MRR (mean reciprocal rank) was used as a further evaluation index; PCMF performs better than the other models regardless of the number of clothing candidates (Fig. 6).
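The abstract does not give the exact form of PCMF's fusion or scoring functions, so the following is only a minimal illustrative sketch (in PyTorch) of the kind of model the Method describes: visual and text features of each item are projected into a shared latent space, fused, and scored against both the paired top and a user preference vector, trained with a pairwise (BPR-style) ranking loss. The feature dimensions, the additive fusion and the loss choice are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only (assumed architecture, not the authors' PCMF code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiModalMatcher(nn.Module):
    def __init__(self, n_users, vis_dim=512, txt_dim=300, latent_dim=64):
        super().__init__()
        # project each modality into a shared latent item space
        self.vis_proj = nn.Linear(vis_dim, latent_dim)
        self.txt_proj = nn.Linear(txt_dim, latent_dim)
        # latent factors capturing each user's style preference
        self.user_emb = nn.Embedding(n_users, latent_dim)

    def item_latent(self, vis_feat, txt_feat):
        # additive fusion of the two modalities (an assumption; the paper's
        # transposition fusion of text and visual features is not detailed here)
        return torch.tanh(self.vis_proj(vis_feat) + self.txt_proj(txt_feat))

    def score(self, user_ids, top_vis, top_txt, bot_vis, bot_txt):
        top = self.item_latent(top_vis, top_txt)   # given upper garment
        bot = self.item_latent(bot_vis, bot_txt)   # candidate lower garment
        user = self.user_emb(user_ids)
        general = (top * bot).sum(-1)    # item-item compatibility (cf. IGCM)
        personal = (user * bot).sum(-1)  # user-item preference (cf. UPCM)
        return general + personal

def bpr_loss(pos_score, neg_score):
    # a truly matching bottom should be ranked above a randomly sampled one
    return -F.logsigmoid(pos_score - neg_score).mean()
```

The two score terms mirror the abstract's decomposition into general (item-item) and personalized (user-item) matching; in the actual model, the simple additive fusion above would be replaced by PCMF's transposition-based fusion of text and visual features.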
Conclusion By combining IGCM (item-item general clothing matching modeling) and UPCM (user-item personalized clothing matching modeling), a personalized clothing matching recommendation model based on multi-modal fusion is constructed, which enables high-precision personalized clothing matching recommendation. Specifically, the goal is to match a lower garment that not only goes well with a given user's top but also fits that user's taste. Overall, the results demonstrate the necessity and effectiveness of combining visual and text modal information and of introducing user factors into personalized clothing matching recommendation, and they verify the practical value of PCMF in real application scenarios. In future work, the two-item clothing matching problem will be reformulated as a multi-instance learning problem, so as to provide users with personalized outfit recommendations that also include shoes and accessories. © 2023 China Textile Engineering Society. All rights reserved.
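To make the reported metrics concrete, the sketch below shows one standard way to compute the pairwise AUC and the MRR over ranked candidate bottoms; it reflects the common definitions of these metrics and is not taken from the paper's evaluation code.

```python
# Standard metric definitions (illustrative; not the authors' evaluation script).
import numpy as np

def pairwise_auc(pos_scores, neg_scores):
    """Fraction of (positive, negative) pairs in which the positive item
    (a truly matching bottom) is scored above the negative one."""
    pos = np.asarray(pos_scores, dtype=float)[:, None]
    neg = np.asarray(neg_scores, dtype=float)[None, :]
    return float((pos > neg).mean() + 0.5 * (pos == neg).mean())

def mean_reciprocal_rank(ranked_lists, ground_truth):
    """ranked_lists[i] lists candidate item ids ordered by predicted score;
    ground_truth[i] is the id of the true complementary item."""
    reciprocal_ranks = []
    for candidates, target in zip(ranked_lists, ground_truth):
        rank = candidates.index(target) + 1  # 1-based rank of the true item
        reciprocal_ranks.append(1.0 / rank)
    return float(np.mean(reciprocal_ranks))
```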
Pages: 176-186
Number of pages: 10