Abstract
In feature space quantization, the central problem is designing an efficient and compact codebook. Hierarchical clustering approaches successfully quantize the feature space into a large vocabulary. In this paper we propose a tree-structured hierarchical self-organizing map (H-SOM) with a depth of two and large branching factors (50, 100, 200, 400, and 500). Moreover, an incremental learning process for the H-SOM is used to mitigate the curse of dimensionality of the feature space. The method is evaluated on three public datasets. Results exceed the current state-of-the-art retrieval performance on the Kentucky and Oxford5k datasets, although performance is lower on the Holidays dataset. The experimental results indicate that the proposed tree structure yields significant improvement with large branching factors.
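To make the two-level structure concrete, the following is a minimal sketch (not the authors' code) of a depth-two H-SOM codebook. It assumes 1-D SOM grids with Gaussian neighborhoods, linearly decaying learning rate and neighborhood width, and a branching factor `b` drawn from the values listed above; the names `train_som` and `build_hsom` and all hyperparameters are illustrative, and the incremental learning step mentioned in the abstract is not shown.

```python
import numpy as np

def train_som(data, n_units, epochs=10, lr0=0.5, sigma0=None, seed=0):
    """Train a 1-D SOM and return its codebook (n_units x dim)."""
    rng = np.random.default_rng(seed)
    sigma0 = sigma0 or n_units / 2.0
    # Initialize unit weights from random training vectors.
    weights = data[rng.choice(len(data), n_units, replace=True)].astype(float)
    grid = np.arange(n_units)
    n_steps, step = epochs * len(data), 0
    for _ in range(epochs):
        for x in data[rng.permutation(len(data))]:
            t = step / n_steps
            lr = lr0 * (1.0 - t)               # decaying learning rate
            sigma = sigma0 * (1.0 - t) + 1e-3  # decaying neighborhood width
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
            weights += lr * h[:, None] * (x - weights)
            step += 1
    return weights

def build_hsom(data, b=50):
    """Depth-two H-SOM: a root SOM of b units, then one child SOM per root cell."""
    root = train_som(data, b)
    # Assign each descriptor to its best-matching root unit.
    assign = np.argmin(
        np.linalg.norm(data[:, None, :] - root[None, :, :], axis=2), axis=1)
    children = []
    for k in range(b):
        subset = data[assign == k]
        # Train a child SOM only if the cell holds enough descriptors.
        children.append(train_som(subset, b) if len(subset) >= b else subset.copy())
    return root, children  # codebook of up to b * b visual words
```

Under these assumptions, quantizing a descriptor means finding its best-matching root unit and then its best-matching unit in the corresponding child SOM, so a branching factor of 500 yields up to 250,000 visual words with only two rounds of nearest-neighbor search.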