A key question in visual search is how to leverage the rich contextual information available to a visual computational model in order to build more robust search systems that better satisfy users' needs and intentions. In this paper, we introduced a ranking model that exploits the complex relations between the visual and textual information of products in visual search systems. To capture these relations, we focused on graph-based paradigms that model the connections among product images, product category labels, and product names and descriptions. We developed a unified probabilistic hypergraph ranking algorithm that models the correlations between product visual features and textual features, thereby substantially enriching the description of each image. We evaluated the proposed ranking algorithm on a dataset collected from a real e-commerce website. The comparison results demonstrate that our algorithm substantially improves retrieval performance over visual-distance-based ranking.
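The abstract does not spell out the ranking formulation, but hypergraph-based ranking of this kind is commonly realized with the transductive regularization framework of Zhou et al.: products are vertices, and each category label, textual term, or visual-feature cluster induces a hyperedge grouping the products that share it. The sketch below is a minimal illustration of that general paradigm, not the authors' exact model; the function name hypergraph_rank, the weight vector w, and the trade-off parameter alpha are illustrative assumptions.

```python
import numpy as np

def hypergraph_rank(H, w, y, alpha=0.9):
    """Transductive ranking on a (probabilistic) hypergraph.

    H     : (n_products, n_edges) incidence matrix. H[v, e] = 1 when product v
            belongs to hyperedge e (e.g. products sharing a category label or
            a name/description keyword); soft memberships in [0, 1] give a
            probabilistic hypergraph. Every product is assumed to lie in at
            least one hyperedge.
    w     : (n_edges,) hyperedge weights.
    y     : (n_products,) initial relevance scores, e.g. visual similarity
            of each product image to the query image.
    alpha : trade-off between hypergraph smoothness and the initial scores.
    """
    W = np.diag(w)
    d_v = H @ w                        # weighted vertex degrees
    d_e = H.sum(axis=0)                # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(d_v))
    De_inv = np.diag(1.0 / d_e)

    # Normalized hypergraph random-walk operator.
    Theta = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt

    # Closed-form minimizer of the regularized ranking objective:
    # f* = (1 - alpha) * (I - alpha * Theta)^{-1} y
    n = H.shape[0]
    return (1 - alpha) * np.linalg.solve(np.eye(n) - alpha * Theta, y)
```

In this formulation a single hyperedge can relate many products at once, so relevance propagates through shared labels and terms rather than only through pairwise visual similarity, which is the behavior the abstract contrasts with plain visual-distance ranking.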
Zeng, K., Wu, N., Sargolzaei, A., & Yen, K. (2016). Learn to Rank Images: A Unified Probabilistic Hypergraph Model for Visual Search. Mathematical Problems in Engineering, 2016. https://doi.org/10.1155/2016/7916450