
Benchmarking result diversification in social image retrieval

Abstract: This article addresses the issue of retrieval result diversification in the context of social image retrieval and discusses the results achieved during the MediaEval 2013 benchmarking. 38 runs and their results are described and analyzed in this text. A comparison of expert vs. crowdsourcing annotations shows that crowdsourcing results differ slightly and exhibit higher inter-observer disagreement, but are comparable at lower cost. Multimodal approaches achieve the best results in terms of cluster recall. Manual approaches can reach high precision but often lower diversity. Based on this detailed analysis of the results, we offer insights for future work on this topic.
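The abstract evaluates runs by cluster recall, the standard diversity metric in the MediaEval diversification task: the fraction of ground-truth image clusters represented within the top N retrieved results. As a rough illustration (not code from the paper; the function name and inputs are hypothetical), it can be sketched as:

```python
def cluster_recall_at_n(ranked_ids, cluster_of, n=20):
    """Fraction of ground-truth clusters represented in the top-n results.

    ranked_ids: image ids in ranked retrieval order (hypothetical input format).
    cluster_of: dict mapping image id -> ground-truth cluster label.
    """
    total_clusters = len(set(cluster_of.values()))
    if total_clusters == 0:
        return 0.0
    # Collect the distinct clusters covered by the top-n ranked images.
    seen = {cluster_of[i] for i in ranked_ids[:n] if i in cluster_of}
    return len(seen) / total_clusters
```

A ranking that surfaces many distinct clusters early scores high on this metric even if some individual results are less relevant, which is why the abstract contrasts it with precision.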
Document type :
Conference papers
Contributor: Léna Le Roy
Submitted on : Tuesday, July 17, 2018 - 2:36:13 PM
Last modification on : Saturday, June 25, 2022 - 9:11:42 PM

B. Ionescu, A. Popescu, H. Muller, M. Menendez, A.-L. Radu. Benchmarking result diversification in social image retrieval. 2014 IEEE International Conference on Image Processing (ICIP), Oct 2014, Paris, France. pp.3072-3076, ⟨10.1109/ICIP.2014.7025621⟩. ⟨cea-01841686⟩
