Purchasing well-fitting clothes on e-commerce portals is difficult when consumers cannot trust the fit implied by size labels. The problem is especially acute in developing countries, where size labels are not adequately standardized. In this paper, we introduce a system that takes a person's mirror selfie as input and accurately predicts anthropometric measurements from that image. These measurements can then be used to predict clothing fit via supplier-specific measurement-to-label mappings that e-commerce clothing retailers either already have or can easily develop. The key novelty of our proposal is the use of mirror selfies, which physically guarantees that an object of standardized, known size, a cellphone, appears in the image at a predictable orientation and location relative to the person being photographed. For predicting measurements, we experimented with a number of regression models. Empirical testing showed that the best regression models yield $\le 5\%$ test-set error with respect to 11 tailor-derived body measurements for each of 70 male subjects. The empirical success of our proposal suggests that this approach may considerably simplify online body size prediction.
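The abstract's central idea is that the cellphone's known physical size provides an in-image scale reference for converting pixel distances to real-world lengths. The sketch below illustrates this pixel-to-millimetre conversion and the relative-error metric implied by the reported $\le 5\%$ figure; the phone dimensions, function names, and example values are illustrative assumptions, not details from the paper.

```python
# Illustrative sketch (not the paper's implementation): using a phone of
# known physical size as an in-image scale reference.

PHONE_HEIGHT_MM = 146.7  # assumed handset height; a real system would
                         # look this up per device model

def pixels_to_mm(length_px: float, phone_height_px: float) -> float:
    """Convert a pixel length to millimetres using the phone's
    known height as the scale reference in the same image."""
    scale_mm_per_px = PHONE_HEIGHT_MM / phone_height_px
    return length_px * scale_mm_per_px

def pct_error(predicted_mm: float, tailor_mm: float) -> float:
    """Relative error (%) against a tailor-derived ground-truth
    measurement, as used to judge the <=5% test-set error."""
    return abs(predicted_mm - tailor_mm) / tailor_mm * 100.0

# Example: a body span of 620 px, with the phone spanning 200 px in the
# same image, maps to roughly 455 mm under the assumed phone height.
width_mm = pixels_to_mm(620.0, 200.0)
```

In practice such a pixel-derived estimate would be one input feature to the regression models the paper evaluates, rather than the final measurement itself.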
Citation
Sheth, M., & Srivastava, N. (2020). Predicting Body Size Using Mirror Selfies. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11886 LNCS, pp. 151–162). Springer. https://doi.org/10.1007/978-3-030-44689-5_14