
Joint Hypergraph Learning for Tag-Based Image Retrieval
Abstract
As image sharing websites such as Flickr become increasingly popular, many researchers have focused on tag-based image retrieval, one of the main ways to find images contributed by social users. In this research field, tag information and diverse visual features have been investigated. However, most existing methods use these visual features separately or sequentially. In this paper, we propose an approach that fuses global and local visual features to learn the relevance of images via hypergraph learning. A hypergraph is first constructed from global visual features, local visual features, and tag information. Then, we propose a pseudo-relevance feedback mechanism to obtain pseudo-positive images. Finally, with the hypergraph and the pseudo-relevance feedback, we adopt a hypergraph learning algorithm to compute the relevance score of each image with respect to the query. Experimental results demonstrate the effectiveness of the proposed approach.
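The relevance-scoring step described above can be sketched with the standard hypergraph label-propagation formulation (vertices are images, hyperedges group images that share a tag or a visual cluster). The incidence matrix `H`, edge weights `w`, seed vector `y`, and the damping parameter `alpha` below are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def hypergraph_relevance(H, w, y, alpha=0.9, n_iter=50):
    """Propagate pseudo-relevance labels over a hypergraph (a sketch,
    not the authors' exact algorithm).

    H : (n_vertices, n_edges) incidence matrix, H[i, e] = 1 if image i
        belongs to hyperedge e (e.g. shares a tag or visual cluster)
    w : (n_edges,) hyperedge weights
    y : (n_vertices,) initial scores, 1 for pseudo-positive images else 0
    """
    Dv = H @ w                # weighted vertex degrees
    De = H.sum(axis=0)        # hyperedge degrees (vertices per edge)
    inv_sqrt_Dv = 1.0 / np.sqrt(Dv)
    # Theta = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}
    Theta = (inv_sqrt_Dv[:, None] * H) @ np.diag(w / De) @ (H.T * inv_sqrt_Dv[None, :])
    f = y.astype(float)
    for _ in range(n_iter):   # f <- alpha * Theta f + (1 - alpha) * y
        f = alpha * (Theta @ f) + (1 - alpha) * y
    return f                  # relevance score per image

# Toy usage: 4 images, 2 hyperedges ({0,1,2} and {2,3}); image 0 is the seed.
H = np.array([[1, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
scores = hypergraph_relevance(H, w=np.ones(2), y=np.array([1.0, 0, 0, 0]))
```

Images sharing a hyperedge with the pseudo-positive seed (here, image 1) end up scored higher than images connected only indirectly (image 3), which is the behavior the re-ranking step relies on.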
Conclusion and Future Work
In this paper, we propose a new joint re-ranking method for social image retrieval that simultaneously utilizes global visual features, local visual features, and textual features to improve retrieval accuracy. Experimental results on the NUS-WIDE dataset show that combining the global and local visual features performs much better than using either alone, and that our method outperforms the comparison methods. The discussion of the experiments shows that our method depends only weakly on the learning parameters, the clustering method, and the distance metric we apply.
However, our method only considers the relevance of the results and ignores their diversity. In future work, we will investigate promoting diversity through multiple visual features.